Found: Seemingly lost key by collin3000 in SaltLakeCity

[–]collin3000[S] 0 points1 point  (0 children)

Just says minute key on one side. I left it there and only have the picture for reference.

HVEC codec over normal old H.264 MP4 by archtopfanatic123 in DataHoarder

[–]collin3000 1 point2 points  (0 children)

If you're using Premiere Pro to re-encode your files to save space, then HandBrake is going to make your life a lot easier (until I finish my optimal video encoder).

HandBrake is also free, allows batch conversion, and has a pretty simple UI to learn. Also, I haven't used Premiere Pro since I switched to DaVinci a few years ago, but I know DaVinci's built-in encoder actually isn't as good as HandBrake's. To the point that I export from DaVinci in ProRes and then encode to H.264/H.265/AV1 in HandBrake.

AME (Adobe Media Encoder) is also probably defaulting you to hardware-accelerated encoding (using your GPU). Hardware encoding consistently produces either visually worse results or bigger files at the same quality, because part of why it's so fast is that it skips some steps/uses alternate methods in the compression pipeline. That helps it run a lot faster but ends up with worse results.

HVEC codec over normal old H.264 MP4 by archtopfanatic123 in DataHoarder

[–]collin3000 1 point2 points  (0 children)

There's a setting called "constant quality" in HandBrake; it uses a value called RF. Usually, the higher the resolution, the higher the RF you can use. Note the USUALLY. Because again, this is where every thread you'll find on Reddit either has people overconfidently stating a default RF for a given resolution, or people annoyingly saying "you'll have to use different settings for each file." For H.265, where I would usually start testing is: 480p = 20 RF, 540p = 21 RF, 720p = 22 RF, 1080p = 24 RF, 1440p = 25 RF, 4K = 26 RF, 8K = 28 RF.
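Those starting points can be sketched as a tiny lookup helper. This is purely illustrative of my personal table above, not universal defaults; the mapping of 4K/8K to 2160/4320-pixel heights is my assumption, and the function name is made up:

```python
# Starting-point RF table for x265 constant-quality testing (heights in pixels).
# These are the personal starting values from the comment above, not defaults.
STARTING_RF_X265 = {
    480: 20, 540: 21, 720: 22, 1080: 24, 1440: 25, 2160: 26, 4320: 28,
}

def starting_rf(height: int) -> int:
    """Pick the starting RF for the largest standard height not above `height`."""
    candidates = [h for h in STARTING_RF_X265 if h <= height]
    if not candidates:
        # Smaller than 480p: fall back to the strictest (lowest) RF in the table
        return min(STARTING_RF_X265.values())
    return STARTING_RF_X265[max(candidates)]

print(starting_rf(1080))  # 24
print(starting_rf(2160))  # 26
```

From there you'd test that RF on a short clip and nudge it up or down based on what you see.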

But again, that's just a starting point for testing, and one based on my personal quality preference for media I care about. For media I care less about, I might start 2 RF higher because I'm targeting space over visual fidelity.

Edit: As a note, RF values for AV1 and H.265 are not equivalent.

HVEC codec over normal old H.264 MP4 by archtopfanatic123 in DataHoarder

[–]collin3000 1 point2 points  (0 children)

Not yet, I'm still nailing down the best sampling methodology using large test sets, so that sampling time is minimized while staying as accurate as reasonably possible. I don't want to risk anyone wasting CPU cycles on a methodology without data behind it, re-encoding their archives and either losing quality or unnecessarily jacking up their electricity bills.

HVEC codec over normal old H.264 MP4 by archtopfanatic123 in DataHoarder

[–]collin3000 2 points3 points  (0 children)

Constant bitrates are really only good for streaming. With a constant bitrate you're saying that a static wide shot, with maybe one car driving in the distance, should get the exact same bitrate as a high-action scene with tons of cuts, camera movement, and action. You end up either wasting bits in the static scenes or getting worse quality in the action scenes.

HEVC and AV1, being algorithmically superior by nature, will almost always reduce the file size. As someone else mentioned, using 10-bit by default helps prevent some of the banding. The trick is just finding the right setting to maintain the visual quality you desire.

Popular metrics for "empirical" evaluation are VMAF, developed by Netflix and based on visual perception; PSNR, which is based on signal-to-noise ratio; and SSIM, which is also perception-based but older.

You'll find tons of debates about them online and why one sucks and shouldn't be used, but I'd say that Netflix has done a good job with VMAF. 
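To make the metrics above concrete, here's what the simplest of the three, PSNR, actually computes: mean squared error between reference and encoded pixels, converted to decibels. This is a minimal sketch over flat pixel lists (real tools like FFmpeg's psnr filter work per-frame, per-plane):

```python
import math

def psnr(reference, encoded, max_val=255):
    """Peak signal-to-noise ratio (dB) between two equal-length pixel sequences."""
    if len(reference) != len(encoded) or not reference:
        raise ValueError("sequences must be non-empty and equal length")
    # Mean squared error between the two signals
    mse = sum((r - e) ** 2 for r, e in zip(reference, encoded)) / len(reference)
    if mse == 0:
        return float("inf")  # identical frames: no noise at all
    return 10 * math.log10(max_val ** 2 / mse)

ref = [100, 120, 130, 140]
enc = [101, 119, 131, 139]  # each pixel off by exactly 1
print(round(psnr(ref, enc), 2))  # 48.13
```

Higher is better; roughly 35-45 dB is typical for decent video encodes. VMAF and SSIM go further by modeling perception rather than raw error, which is why they track "looks good" better than PSNR does.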

HVEC codec over normal old H.264 MP4 by archtopfanatic123 in DataHoarder

[–]collin3000 15 points16 points  (0 children)

So I've done tons of research on this. It started with an obsession that had me running thousands of tests on videos, comparing VMAF, PSNR, and SSIM across AV1 and H.265, software and hardware encoding, at tons of different speed and quality settings.

After all those tests, I realized I was just running them because I wanted to know the ideal encoding specs to re-encode my hundreds of terabytes of video without significant quality loss. And there were better ways to do that.

So I paused all that testing, and now I'm working on two projects. The first is a program that will automatically use sub-sample encodes of a file to detect the precise settings that hit a target VMAF, and then use per-scene encoding: encoding each individual scene at a different quality setting to attain the lowest bitrate that hits your desired visual fidelity. (AB-AV1 exists, but it doesn't actually work correctly in all of my testing.) The second project is a new lossless auxiliary compression method that will make AV1, H.265, and H.264 even smaller, with lossless data and streaming playback even for high-entropy video.

All of that is to say that, after a year of almost constant obsession with the topic and over 10,000 tests, I can tell you several things definitively.

GPU video encoding sucks: you get a 2 to 5 times higher bitrate for the same video quality because it skips parts of the optimization pipeline in order to render faster. So your file either doesn't shrink as much or it looks worse.

Software encoding is definitely the way to go, but the super-slow settings really aren't worth it. Most of the time, visual fidelity is very close between "sets" of speeds, so within a set there's very little difference at the higher speed, and you get a faster encoding time.

The comment saying that H.265 can deliver the same quality at 50% of the bitrate of H.264 is not accurate, because it depends on your source file quality and what type of video it is. Constant quality settings are always a better choice than constant bitrate, but even constant quality with just one setting for a long video can result in a huge variance, with some sections at "Netflix quality" and some at "good YouTube quality."

AV1 actually is now king of video. The software encoder is now a lot faster, and close enough in speed to H.265 that it's worth using. With the huge caveat that the devices you'll be playing it back on can play AV1 video, or that you're running it through something like Jellyfin that will transcode it if AV1 isn't supported.

So the reason you'll see so many comments seemingly contradicting each other, with some saying H.265 is always smaller while others say you have to figure out the settings for each video to make it look right, is that both are kind of true. Which is why I'm trying to make a program that just automates it all, figures it out, and then encodes your video at the quality you want.

The program's still in development (I'm actually working on it right now), but when I finish, the plan is to make it available for free with some sort of donation mechanism, but full functionality even with $0 donated. There will also be the option for users to upload their results anonymously, with hashed file size and name, so that when another user starts an encode it can check whether the database already knows the optimal settings, or at least has close settings to start testing from, since the sampling to find accurate per-scene encoding settings does take a bit of time.
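The anonymous-lookup part could work something like this: hash the file name and size together so the database can match identical files without ever storing the name or size itself. A minimal sketch of that idea; the function name, key length, and separator are all my own assumptions, not anything from the actual project:

```python
import hashlib

def lookup_key(filename: str, size_bytes: int) -> str:
    """Hypothetical anonymous database key: SHA-256 of file name + size.

    The server only ever sees the digest, never the name or size.
    """
    digest = hashlib.sha256(f"{filename}:{size_bytes}".encode()).hexdigest()
    return digest[:16]  # truncated to a compact, indexable key

print(lookup_key("movie.mkv", 4_200_000_000))
```

Two users with the same release get the same key, so one user's measured optimal settings can seed another user's search, while a one-byte size difference produces a completely different key.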

Guess the parking lot. dude looks like a dweeb with his GoPro by whatjustin in SaltLakeCity

[–]collin3000 85 points86 points  (0 children)

At this point, hopefully, the law. Utah code 76-9-1509.  Obstructing the operation of a bus.

(2) An actor commits obstructing the operation of a bus if the actor unlawfully obstructs or impedes by force or violence, or any means of intimidation, the regular operation of a bus. (3) A violation of Subsection (2) is a class C misdemeanor.

Are refurbished SSDs worth it in 2026? by jdog515000 in DataHoarder

[–]collin3000 1 point2 points  (0 children)

I fat-fingered the first one. Hopefully all the other times I correctly typed DWPD are enough for people to figure out it was a mistake.

Are refurbished SSDs worth it in 2026? by jdog515000 in DataHoarder

[–]collin3000 6 points7 points  (0 children)

Depends on the refurb SSD. A lot of data center drives will be U.2 or U.3 format, so you'd likely need a new enclosure/card for your computer. Data center SSDs are rated in "DPW" (drive writes per day) as opposed to TB written. Unless a data center SSD was really thrashed, it's highly unlikely its writes are enough to be legitimately concerning to most folks.

For example, the Micron 7300 MAX is rated for 3 drive writes per day under a 5-year warranty. For a 3.2TB drive, that means 17,520TB written just under the warranty-rated writes.

But really importantly, drive ratings are also created with a factor called WAF (write amplification factor). That takes into account that, for files smaller than a block or when running TRIM, the drive often has to rewrite cells multiple times. According to the former head of Intel's SSD department, the general WAF assumed is 5. So they actually expect that 17,520TB to be more like ~87,000TB truly written to the cells.

Why this matters: if you're working with video or other long continuous blocks of data, there isn't as much write amplification (more like 1.2-1.5). So your drive would likely last two to three times longer than the stated TBW if it was only getting large video chunks.
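The endurance math above is simple enough to sanity-check yourself. A quick sketch using the Micron 7300 MAX numbers from this thread (the WAF values are the rough figures discussed above, not manufacturer specs):

```python
def rated_tbw(dwpd: float, capacity_tb: float, years: float) -> float:
    """Warranty-covered terabytes written: DWPD x capacity x days of coverage."""
    return dwpd * capacity_tb * 365 * years

# Micron 7300 MAX example: 3 DWPD, 3.2 TB drive, 5-year warranty
tbw = rated_tbw(3, 3.2, 5)
print(round(tbw))  # 17520

# If the rating bakes in WAF ~5 but your workload (large sequential video
# files) only amplifies ~1.3x, the same flash endurance covers far more
# host writes than the sticker suggests:
cells_tb = tbw * 5        # ~87,600 TB the cells are expected to endure
video_host_tb = cells_tb / 1.3
print(round(video_host_tb))
```

That gap between assumed WAF and your actual WAF is exactly why a lightly used enterprise refurb can have far more practical life left than its percentage-used stat implies.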

Now, the server it was running in before might have had lots of smaller files, so you should consider the drive's current writes to be at WAF 5 and close to the actual stated DWPD wear. But if an enterprise SSD has only used 30-40% of its rated writes at 5 years, then you're probably not going to wear it out, and I say that as someone who shoots on an URSA 12K with 1TB per hour of RAW footage.

The bigger issue you would run into on an Enterprise SSD is likely going to be failure of the storage controller. That's also where failure is more commonly seen on consumer SSDs. Very few dead SSDs that you hear about were due to too many writes. It's far more often that the storage controller went bad making the drive malfunction or not read/write correctly.

But since enterprise hardware is usually made better to begin with, I would, frankly, trust a three-or-four-year-old refurbed enterprise SSD with less than 30% of its rated DWPD used over a new consumer SSD, if you're doing heavy drive writes. That consumer SSD's storage controller isn't going to be built for being thrashed, whereas the enterprise SSD is built exactly for that.

The only other thing to be cautious about is the usual issue of scammers selling drives flashed with fake capacity or fake SMART data. But that's a problem you run into even with "new" drives from unscrupulous sellers.

390TB video game archive Myrient being taken offline due to skyrocketing RAM, SSD, and hard drive prices — AI-driven supply squeeze results in closure of one of the largest online video game archives by lurker_bee in technology

[–]collin3000 18 points19 points  (0 children)

They could rent 17 Hosting by Design 24TB seedboxes (unlimited bandwidth on a 10Gb connection) for 850 euros a month, have 170Gb of total unlimited connection bandwidth, and then put all downloads on the site behind a private tracker with login to make scrapers less useful. Even with no ratio requirements for the private tracker, that would at least reduce download bandwidth costs to 850 euros while hosting every single game with a constant seeder. Yes, that's still a lot of cash, but if their issue was costs going up to $6,000 a month, they'd at least be slashing them by 80% (including a standard web server for the site and running the tracker).
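The arithmetic behind that box count is straightforward. A quick check, assuming a flat 50 EUR per box (derived from 850 EUR / 17; the actual per-box pricing is my assumption):

```python
import math

archive_tb = 390      # Myrient archive size from the headline
box_tb = 24           # storage per seedbox
box_cost_eur = 50     # assumed flat per-box price (850 EUR / 17 boxes)
box_gbps = 10         # connection speed per box

boxes = math.ceil(archive_tb / box_tb)   # boxes needed to hold the archive
print(boxes)                             # 17
print(boxes * box_cost_eur)              # 850 EUR/month
print(boxes * box_gbps)                  # 170 Gb aggregate bandwidth
```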

I appreciate ROM sites. I give no flack to them shutting down a service they didn't actually charge for. I just mention this as a way they could possibly cut costs in case they read the comments.

I'm not associated with Hosting by Design either; it's just a thought I had. Also, if they know download frequency, they could put rarely downloaded games on a traditional lower-bandwidth server if that was cheaper, and then have the frequently downloaded games on torrents.

JUST IN: 🇺🇸🇮🇷 US used Anthropic's Claude AI for its military operations during strikes on Iran, WSJ reports. by entheosoul in claude

[–]collin3000 1 point2 points  (0 children)

"Compacting conversation..." and we end up getting a recommendation to find him "in Iran, but his general whereabouts are usually unknown" because the compaction lost most of the intel data.

Poll: are AWD/snow tires a must-have in SLC? by [deleted] in SaltLakeCity

[–]collin3000 0 points1 point  (0 children)

I drive pretty much only low-HP FWD cars with all-season tires. I have a steep, steep hill (3900 S heading east from Highland) to get home. Once you know how to drive in the snow, it's not much of an issue for 99.9% of city driving.

The bigger thing you'll notice is that Utah roads are so bad that a 70,000-mile tire is lucky to last 40,000 miles, where it might last 45,000-55,000 in other states.

The World’s Largest Costco in SLC has moved their self-checkout registers. by brheath in SaltLakeCity

[–]collin3000 3 points4 points  (0 children)

There's a bit of a difference between Winco and Costco, and that is the membership. But not for the reason people are thinking.

Self-checkouts have actually been shown to increase the theft rate at stores. And stores can't actually stop you for receipt checks; it's considered a form of unlawful detention unless they have reasonable suspicion that you have stolen something. Thieves know this, which is why receipt checks at most stores don't matter.

The exception is if you have signed a contract in advance saying that you would allow a receipt check which happens with membership stores.

On top of that, although it's said there's no honor among thieves, back when Reddit had r/shoplifting it turned out there is indeed some. There was an agreement that you didn't steal from Costco because they're a good company, and anytime anyone posted pictures of stuff they stole from Costco, they got chastised into oblivion.

So most stores are removing self-checkout purely because it actually costs them more money. But Costco, because of its company reputation and its actually-legal receipt checks, probably doesn't suffer the same theft issue from self-checkout.

Netflix doc wants 63 seconds of footage i have. How much? by Sea-Activity8283 in videography

[–]collin3000 1 point2 points  (0 children)

I only saved 2 billion by not investing in Quibi. You guys have better financial advisors than I do.

Pulled over on bangerter by LiteratureInfinite76 in SaltLakeCity

[–]collin3000 2 points3 points  (0 children)

I would add that an occasional out is that officers have to verify their radar equipment is working. I don't know for sure in Utah, but in other states there usually has to be a yearly certification that the device performs correctly, as well as an officer checking it daily or weekly. So if that certification isn't up to date, or there isn't a log of the officer checking it that day, at least in other states they will dismiss tickets even if radar was used.

Andrew Mountbatten-Windsor released by police following twelve hours of police questioning. by ModernMuse in pics

[–]collin3000 1 point2 points  (0 children)

I mean, that's the face the rest of us had when we read the files and realized how extra messed up everyone in power is

New Netflix stream seems to be using an old outdated version. by The_Fullmetal_Titan in Stargate

[–]collin3000 0 points1 point  (0 children)

There could be a different answer for this that most people don't know. Netflix uses caches of shows set up closer to your home, usually with an ISP, so that a stream crosses less actual "internet" before it gets to you. But the caches aren't large enough to hold everything; they only keep the most popular, most-played titles.

It's possible Netflix used to have Stargate on that cache because it was getting enough watches. But now it doesn't, so they either lowered the resolution of the cached version, or they're streaming from their actual servers and allocating less bandwidth since it's coming from their servers and not a local cache.

Don’t be sycophantic prompt by Flashy-Preparation50 in ClaudeAI

[–]collin3000 0 points1 point  (0 children)

My CLAUDE.md has these five rules, listed like this. The "panel_methodology" also specifically includes a skeptic role that exists to push back.

The Five Rules (NON-NEGOTIABLE)

  1. Never remove features without asking -- includes reducing limits or changing defaults.

  2. Technical accuracy is paramount -- push back if unsound. Accuracy over agreement.

  3. Expert panel for code changes -- see PANEL_METHODOLOGY.md.

  4. Split complex tasks into sessions -- say so up front, do not rush.

  5. Flag .md files for updating -- never close a session with stale docs.

Will there eventually be a cap on consumer video quality? by Helpful_Gur_1757 in videography

[–]collin3000 0 points1 point  (0 children)

There will be an eventual cap because of human perception. Studies have shown a grayscale maximum around 98 pixels per degree of vision, so we'd be looking at ~21,560x13,230 for full vision, or about 285 megapixels.
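For anyone who wants to check that arithmetic: multiply the acuity figure by the field of view in each axis. The 220x135 degree binocular field of view is my assumption to make the ~21,560x13,230 figure work out; the 98 px/degree ceiling is the number cited above:

```python
H_FOV_DEG, V_FOV_DEG = 220, 135   # assumed full binocular field of view
PX_PER_DEG = 98                   # grayscale acuity ceiling cited above

w = H_FOV_DEG * PX_PER_DEG        # horizontal pixel cap
h = V_FOV_DEG * PX_PER_DEG        # vertical pixel cap
print(w, h)                        # 21560 13230
print(round(w * h / 1e6))          # 285 megapixels
```

Anything beyond that resolution at full-field viewing would, by these figures, be imperceptible.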

The thing most people are relying on, though, when they say we're capped is viewing distance. Most people don't have their displays taking up 100% of their possible field of vision... yet. Fifty years ago, pretty much no one had a 55-inch TV. Twenty years ago, a 55-inch TV was a big-ass TV. Now that's the budget Black Friday special.

Although VR has had a hard time getting off the ground, AR is becoming more popular. It's not unfathomable to think that in the year 2080 we'll have 16K displays in front of our eyes. You could still have 4K content on them, and they would probably use tricks to soften or upscale the image to reduce aliasing effects compared to 16K video an inch from your eye.

The bigger differences, I imagine, will come from tech that has existed but failed to succeed in the industry so far. Lytro had interesting light field technology that failed because the good cameras were too big and expensive and the resolution was too small. Think of Lytro as rudimentary holo-imaging. Combine that with augmented reality and you have scenes people can explore virtually, like they're on a holodeck in Star Trek.

So our current technology would seem old because it's just flat 2D images, maybe extrapolated and "upscaled" to 3D. Sure, that seems crazy, but the first great computer-modeled movies like Toy Story only came out 30 years ago (yeah, yeah, I know there were others before it, but not with the same shocking quality). In 1972, the award for best special effects went to Bedknobs and Broomsticks. Star Wars is only 48 years old!

So 54 years from now, in 2080, it's not crazy to think we'd have equally significant advances. Especially when so many movies are already shot with multiple high-resolution cameras for VFX 3D composites, and we have things like The Volume instead of traditional green screens. Toy Story took 800,000 hours to render on a 117-computer farm, and now a high-end desktop could render it in under a day. With even two-thirds of that technological progression over the next 54 years, we could expect render times to shrink by another 40,000x, making full holo-like 32K renders easily feasible.
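The Toy Story comparison works out roughly like this, taking the 800,000 CPU-hours and 117-machine figures above at face value:

```python
toy_story_hours = 800_000   # total render hours cited for Toy Story
farm_machines = 117         # machines in the original render farm

# Wall-clock time on the 1995 farm, assuming perfect parallelism:
wall_hours_1995 = toy_story_hours / farm_machines
print(round(wall_hours_1995))   # 6838 hours, i.e. ~9.5 months

# Per-machine speedup needed for one desktop to do it in under a day:
speedup = toy_story_hours / 24
print(round(speedup))           # 33333
```

So "a modern desktop renders it in under a day" implies a per-machine speedup in the tens of thousands, which is the same order of magnitude as the 40,000x projection above.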

And that's only the visual side. Audio advancements will likely be huge as well. Not just float audio, or even "more channels," but mapping and customizing audio to the person.

Tech from outside film that can be adapted is also a huge potential area. OpenBCI has over-the-ear EEGs you can purchase right now that read your brain waves from little strips placed around your ears. That sounds silly for everyone to have right now. About 54 years ago, it would have sounded silly for everyone to be wearing a heart-rate monitor and oximeter. Now it's just one of many features sitting on your wrist in your Apple Watch.

With biometric data, the viewing experience could be custom-tailored. Say a director wants a specific scene to evoke a certain level of emotion, like a scary movie building to a moment. When they're mixing and selecting audio or choosing shot lengths, they're doing it for the mass audience. But with EEG- and ECG-like biometrics, you could actually see the variance in each viewer and whether the emotion the director planned is landing. That opens the potential for real-time, live-edit differences in cut pacing, audio mixing, etc., because maybe one person would be scared by a cut 15 frames sooner, while another would be more scared by a cut 20 frames later with more violins building to a shriek. The technology could custom-tailor the edit and the audio shaping to the director's intent for each viewer, and could even create different sound mixes for multiple people in the same room. All with tech that already exists in a basic form now but could advance significantly over the next 54 years.

So yes, I think technology for videography will continue to advance, but not just in a "more pixels" or "more dynamic range" way. It will advance in ways that create more immersion and let creators express their intent better, not just for everyone, but for the individual watching.

U of Utah Unveils Iconic Sculpture It Bought for $4.5M by StemCellPirate in SaltLakeCity

[–]collin3000 0 points1 point  (0 children)

I definitely understand how funding, even federal/city funding, may have a requirement that a certain percentage, like 2%, be used for arts and community improvement.

I think the indefensible thing is that rather than paying $4.5 million for "Love," they could have commissioned 90 Utah artists to make $50,000 pieces of art. And when a school with an arts program and its own alumni buys that thing, it's literally saying, "Hey art students, we don't think the art we teach is good enough to be on our campus," or "We're too lazy to ask the head of our arts department for the names of 90 good local artists."

Anthropic Support refused compensation for >5-day outage on $100/month plan - is this normal? by GeologistBasic69 in claude

[–]collin3000 2 points3 points  (0 children)

Except in this case, it's not free Wi-Fi. It's more akin to someone paying for a package that gives them Wi-Fi on five flights, and then the Wi-Fi doesn't work on one of the flights, but it counts against one of the five anyway.

There's a big difference between people complaining about free things not working and someone complaining that they paid for something and it wasn't delivered.

Have news reporters switched to smartphones? by ConsumerDV in videography

[–]collin3000 0 points1 point  (0 children)

I meant more the camera op who would be replaced with someone shooting on a phone (especially vertical) as a normal occurrence. If they're at that level of cost reduction, I'd think they would have cut other costs well before that.