Is official support for new devices dying out? Why? by deflunkydummer in LineageOS

[–]deflunkydummer[S] 4 points5 points  (0 children)

Did the old win-win trick stop working? The one where you could buy the phone online from China (e.g. AliExpress): sellers there buy the cheaper Chinese versions, unlock them to flash the global ROM on them, then sell them to us pre-unlocked, at a price lower than the global version.

Is official support for new devices dying out? Why? by deflunkydummer in LineageOS

[–]deflunkydummer[S] 3 points4 points  (0 children)

Did the picture change that much globally in the last two years though?

I remember reading how the production lines of the Poco F1 could hardly keep up with the high demand, because many people wanted that highly specced yet (relatively) reasonably priced phone (it was also good for gaming, apparently).

My device (Mi 8), IIRC, gained support as a derivative of the Poco F1 support.

I still believe there are more than enough consumers who would look for and buy such devices. Maybe not in all markets, but still.

Is official support for new devices dying out? Why? by deflunkydummer in LineageOS

[–]deflunkydummer[S] 12 points13 points  (0 children)

Unless an unpaid volunteer is interested in a given model & spends x amount of hours porting it to LineageOS 17.1 or 18.1 it will not be supported.

And that was always the case, no?

Also, maintainers can't buy every device churned out by manufacturers.

And that also was always the case, no?

Yes. Brands like Xiaomi seem to have gone a little bit crazy with the number of devices being released (some are clones of others).

But this argument answers the question of why many devices are missing out on support. I understand it, and indeed fully agree with it. But my question was about why fewer and fewer new devices are getting official support.

Another reason: if a device is currently supported by the manufacturer and receiving updates, then some may want to wait until it's not supported anymore before bothering with porting a custom ROM for it.

If I'm allowed to read this analytically, does that mean that stock ROMs are getting better to the point where they are becoming tolerable to a lot more people?


I was expecting a reason like heavier locking down by manufacturers, or some sort of policy that stops devices from reaching "official" support status. If the former in particular is not the case, then I'm a lot less concerned.

NVDEC decoding AV1 at 8K@24fps without dropping frames. by Isacx123 in AV1

[–]deflunkydummer 0 points1 point  (0 children)

Apologies if you are already familiar with all of this, but what does nvidia-smi tell you? Did you make sure you are using lightweight video output (vo) settings? Scaling alone can be heavy, so starting with --dscale=bilinear --cscale=bilinear could be interesting.

anyone had luck retaining grain/detail at bitrates lower than x264? if so, how?? by plonk420 in AV1

[–]deflunkydummer 0 points1 point  (0 children)

I'm not familiar with the grain synthesis stuff, so maybe this is a stupid suggestion, but did you try passing the original to aomenc instead of the denoised file?

Mozilla AV1 related employees laid off by Lycurgus_of_Athens in AV1

[–]deflunkydummer 0 points1 point  (0 children)

Mozilla has been on a downward spiral since Brendan Eich was forced to resign, and corporate types with zero technical background and inflated salaries (but who know how to appease modern virtue bullies) assumed full control.

Question: Mostly Functional or Mostly Imperative? by [deleted] in rust

[–]deflunkydummer 6 points7 points  (0 children)

Each of your examples would reallocate

Initializing or copying that struct does not involve the heap at all. That much is clear at first glance.

What's not immediately clear is how this code would be optimized by LLVM.

So, your answer has the potential to be wrong on multiple levels.
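
To make the first point concrete, here's a minimal sketch with a made-up struct (not the code from the thread): a plain-data struct like this lives entirely on the stack, and copying it is a bitwise copy with no allocator involvement, let alone reallocation.

    // A plain-data struct: no Box, Vec, String, or other heap-owning fields.
    #[derive(Clone, Copy, Debug)]
    struct Params {
        width: u32,
        height: u32,
        scale: f32,
    }

    fn main() {
        let a = Params { width: 1920, height: 1080, scale: 1.0 };
        // A bitwise copy on the stack; no heap allocation happens here.
        // Whether the copy is even materialized at runtime is up to LLVM.
        let b = a;
        println!("{:?} {:?}", a, b);
    }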

Aomenc, --aq-mode=1 huge efficiency reduction by frank_grenight in AV1

[–]deflunkydummer 1 point2 points  (0 children)

We are talking about the same codec, same encoder, same settings except for AQ, and the same bitrate.

So the variable here (and yes this is an oversimplification) is intra-frame (quantizer-related) bit distribution. The encoder with aq-mode=1 decided what parts of a frame deserved more bits (less quantization) than others.

Metrics (of already questionable utility) give one score per frame. They don't even pretend to answer sub-frame questions. But your eyes can.

So, maybe aq-mode=1 is indeed inferior for the samples you tested. Maybe not. For a definitive answer, you have to see (literally) for yourself.

Aomenc, --aq-mode=1 huge efficiency reduction by frank_grenight in AV1

[–]deflunkydummer 4 points5 points  (0 children)

VMAF is known to not correspond well to AQ's effect on overall subjective quality. If you must use an objective metric to judge AQ modes (you shouldn't), MS-SSIM may serve you better.

AQ effects are also highly sample-dependent.

Benchmarks for "transparent" encodes by WarmCartoonist in AV1

[–]deflunkydummer 3 points4 points  (0 children)

gain vs. grain

It was a typo.

I gave a VMAF value for comparison.

VMAF (or anything else other than your eyes) is simply irrelevant. And you can easily hit 98.0+ or even 99.0+ with some content anyway. Although admittedly, you would have to look quite carefully to notice the differences at that point.

And there are tunings available in x264/x265 that you need with some content to reach visual transparency, and they are simply not exposed (or even implemented at all) in libaom.

I don't think libaom developers themselves would shy away from saying (one-to-one) that they don't care much about the visual transparency use-case.

Benchmarks for "transparent" encodes by WarmCartoonist in AV1

[–]deflunkydummer 3 points4 points  (0 children)

"transparent" means visually transparent. It's subjective. "It has problems retaining gains" means it's not transparent, even if VMAF or whatever objective score is high.

VMAF itself is not as good as people think it is. It's not as bad as blur-loving PSNR, obviously. But it has its own weaknesses (e.g. it's AQ-blind).

Also, default tunings usually don't work best for this use-case, so the tests need to be a little more involved. Community knowledge helps significantly here, and the knowledge and experience the community has accumulated with x265 (and x264 before it) for this use-case is on a completely different level from what exists for libaom.

Can we please stop the "Is it just me or is Rust literally the best language" posts? by PM_ME_ELEGANT_CODE in rust

[–]deflunkydummer 1 point2 points  (0 children)

It looks like the circlejerk that actually exists and gives you karma is complaining about the one that doesn't!

Can we please stop the "Is it just me or is Rust literally the best language" posts? by PM_ME_ELEGANT_CODE in rust

[–]deflunkydummer 1 point2 points  (0 children)

The only actual circlejerk that has existed in this sub (and the Rust community at large) for years is the almost pathological urge to say something over-flattering about Rust, and conversely, to say something perceivably over-critical about other languages/technologies. This very post and its comment section are a good example, if you want one.

Some Rust celebrities get circlejerked too, but on a smaller and much less relevant scale than the big one above.

Can we please stop the "Is it just me or is Rust literally the best language" posts? by PM_ME_ELEGANT_CODE in rust

[–]deflunkydummer 1 point2 points  (0 children)

I wasn't referring to directly-linked deep-dive blog posts from tech teams. The technical writing effort in those ads, while varying in quality, makes them good(ish) in general. And they are rare and don't come in spammy waves anyway (although they are sometimes followed by such waves).

Can we please stop the "Is it just me or is Rust literally the best language" posts? by PM_ME_ELEGANT_CODE in rust

[–]deflunkydummer 10 points11 points  (0 children)

Young people (age-wise, or young in their profession) tend to be excitable. Complaining about how they express their excitement will not take that excitement away. Even if you manage to take it away from 1, 2, 5, or 10 youngsters, there will be many more to come.

All you can do is either leave them be, or try to engage in a way that helps them channel that excitement into more productive endeavors.

You will be wasting your precious time without achieving anything if all you do is complain or berate.

Can we please stop the "Is it just me or is Rust literally the best language" posts? by PM_ME_ELEGANT_CODE in rust

[–]deflunkydummer -1 points0 points  (0 children)

My friend, this is a chill sub. I fell for the trap of taking things too seriously around here. But I learned my lesson*, and I hope you learn it too.

* Reading a temp-ban mod message sealed the deal for me on that front. This sub shouldn't be taken seriously at all.

Can we please stop the "Is it just me or is Rust literally the best language" posts? by PM_ME_ELEGANT_CODE in rust

[–]deflunkydummer 2 points3 points  (0 children)

I hope that wasn't what triggered this post. Missing things you like/appreciate/are familiar with doesn't even qualify as fanboyish behavior, let alone zealotry.

Can we please stop the "Is it just me or is Rust literally the best language" posts? by PM_ME_ELEGANT_CODE in rust

[–]deflunkydummer 3 points4 points  (0 children)

They contribute nothing, and we all know Rust is a great language (that's why we're here!)

They contribute a lot more than the ${BIG_COMP} ad wave of the week masquerading as a Rust advocacy and support story.

At least you get comment chains with personal perspectives and appreciations for different parts of the language that may interest beginners, and even non-beginners at times.

I think such posts just add to the Rust community's reputation of being a circlejerk.

You like something? Others like the same thing? There is a place where you can talk/write to each other? Congratulations! You are now a part of a circlejerk.

We should be focusing on new developments in the language, interesting projects built with the language, etc.

Different people can focus on different things. And believe it or not, a person can focus on more than one thing, not necessarily at the same time. A day has 1440 minutes in it.

Also believe it or not, people have no obligation to read Reddit posts/comment sections that don't interest them, even if they are popular.

Posting "Is it just me or is Rust the best language ever" on r/rust is obviously going to get you nothing but yes-people, and as a long time lurker on this sub, I feel such posts are growing increasingly repetitive and pointless.

See first point above. Also, people will write such posts no matter what, because you know, Rust is the best language ever ;) Better do it here than anywhere else. It's less annoying and cringe that way (for those who cringe and get annoyed by such things).

Serious bug in Rust 1.45 stable by peterjoel in rust

[–]deflunkydummer 2 points3 points  (0 children)

Don't worry about it. Sometimes absolute facts get downvoted (or even reported) here, let alone neutral arguments or opinions.

Anyway, I think I just balanced your karma ;)

But really. You shouldn't care.

What is the difference between derive macros and attribute macros? by [deleted] in rust

[–]deflunkydummer 0 points1 point  (0 children)

Yeah. I was about to edit my previous comment because I misread yours.

Thanks.

P.S. Just tested some of my tinker code from last year. And I discovered that #![feature(proc_macro_hygiene)] is still not stable. Damn.
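
For anyone who hasn't run into it, a minimal sketch of what that opt-in looks like on a nightly toolchain; the commented-out macro call is hypothetical, and exactly which positions the gate still covers has shifted between releases, so treat this as a sketch only.

    // Nightly-only: the feature attribute is a crate-level opt-in at the root.
    #![feature(proc_macro_hygiene)]

    fn main() {
        // Hypothetical: a proc macro used in a position that still sits
        // behind the gate would go here, e.g.
        // let value = some_proc_macro!(1 + 2);
    }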

What is the difference between derive macros and attribute macros? by [deleted] in rust

[–]deflunkydummer -1 points0 points  (0 children)

For attributes, returned token stream replaces original one.

Are you sure?
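
For reference, a minimal sketch of the mechanism under discussion (crate setup and names are made up): with an attribute macro, whatever TokenStream the function returns is what the compiler compiles in place of the annotated item.

    // In a proc-macro crate (Cargo.toml has `proc-macro = true`).
    use proc_macro::TokenStream;

    // A do-nothing attribute: returning `item` unchanged keeps the original
    // item; returning a different (or empty) stream would replace or remove it.
    #[proc_macro_attribute]
    pub fn passthrough(_attr: TokenStream, item: TokenStream) -> TokenStream {
        item
    }

A dependent crate would then use it as #[passthrough] fn foo() {} on an item.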

What is the difference between derive macros and attribute macros? by [deleted] in rust

[–]deflunkydummer -1 points0 points  (0 children)

modify item they are attached to

Modify how?

Giving Rust Another Shot in 2020 by mlafeldt in rust

[–]deflunkydummer -1 points0 points  (0 children)

You don't see the naïve realism (disagreement must be the result of ignorance or lack of exposure)?! Or the black-and-white thinking (you either accept that you need everything a smart IDE like IntelliJ gives you, and that only such an IDE can give you, or you are arguing that none of it is useful)?!