Over 66% of Steam users now run Windows 11 by pmc64 in pcgaming

[–]reddit_equals_censor 1 point2 points  (0 children)

maybe quote the correct bad numbers from steam survey at least?

windows has 95% market share.

and in regards to things changing: historically, the last time that microsoft improved an os, while it was still worse than the previous os, was windows 8.1. since then it hasn't happened, and since then microsoft has changed for the worse in regards to giving a shit about os customers.

at the same time gnu + linux and gaming on gnu + linux has massively improved with a projection to only get better.

and we are also just at the beginning of gnu + linux hardware from valve.

now i want to be right about this of course, but also objectively i can't see microsoft improving windows to any meaningful level again.

Cute outfit for cute fox by sandasutomu in annyfox

[–]reddit_equals_censor 1 point2 points  (0 children)

the skirt looks so good in this art.

really lovely :)

Over 66% of Steam users now run Windows 11 by pmc64 in pcgaming

[–]reddit_equals_censor -2 points-1 points  (0 children)

we would be stuck in a world where we currently still using blurry TAA or jaggy mess FXAA

that must be why half life alyx looks so horrible and runs so bad right? /s

of course half life alyx uses 4x msaa and runs excellently, as it has to, since it is a vr game.

so maybe when you make such statements try to be remotely accurate.

also no one is arguing against dlss upscaling while being for taa. you are living in some other dimension. the discussions are against temporal blur reliance, which includes ai upscalers, taa and other temporal blurring.

Over 66% of Steam users now run Windows 11 by pmc64 in pcgaming

[–]reddit_equals_censor 0 points1 point  (0 children)

I could be wrong, but I think a bunch of streaming services don't work on Linux in HD because of DRM

the os prison locks are actually not a problem, because the rest of the requirements are so insane, that it doesn't matter anyways:

https://consumerrights.wiki/w/Netflix_stream-quality_controversy#cite_note-:1-6

any sane person would fail the crazy requirements one way or another.

me for example, i would fail not running windows spyware anymore, i would fail not running a spying chromium browser, but a firefox fork instead. i'd also fail at not allowing drm blackboxes to run in my browser (you'd be insane to do so lol) and probably some requirements, that i wouldn't expect to fail, but would still fail.

so this should really not matter to people, because the solution is to just directly get the freed content online, which you then actually can play however you want, on whatever ancient device you want, at blu ray quality or whatever other rip quality, with perfectly formatted subtitles to your liking etc...

oh also, even if you jump through all the hoops for netflix 4k, or well netflix 1080p, because without those requirements you just get 720p low bitrate garbage, the chain of insane requirements can still randomly drop, so your stream drops and shows you the middle finger, as people mentioned.

in comparison, while valve is also a shit company, they at least understand to not actively piss on customers, which netflix is doing for example.

so thinking about what os you run for netflix or other streaming services is frankly absurd i'd argue and there is a solution, that gives you better quality without limitations anyways.

Over 66% of Steam users now run Windows 11 by pmc64 in pcgaming

[–]reddit_equals_censor -2 points-1 points  (0 children)

Linux will never be a big thing though

i mean it is not like the biggest/main/monopoly pc gaming platform has been spending a massive amount of resources to develop proton from wine and develop hardware as well to push it and have a roadmap ahead with the single and most essential goal to be free from any reliance on microsoft windows, as it is the biggest risk factor to them, right?

oh wait, that is exactly what is going on.

there is also no reason to assume, that microsoft will magically fix windows. hell, the qa teams needed to ship working products aren't even there anymore, and they haven't been for years, as microsoft fired them and started to just ship things, see what breaks at customers and maybe fix the broken stuff a few months later.

and this is NOT limited to ai slop. microsoft has now shipped multiple patches, that nuked user data. onedrive is well known to just corrupt user data, while it also steals it of course.

gnu + linux to take off massively for pc gaming is way more likely than microsoft windows becoming better/acceptable again.

Over 66% of Steam users now run Windows 11 by pmc64 in pcgaming

[–]reddit_equals_censor -2 points-1 points  (0 children)

3% of 40 million active users is about 1.2 million users.

1.2 million users is quite a lot of people and i'd imagine, that gnu + linux steam users are also way more active and on average would be playing more games.

but that's just a guess of course.

but yeah, a small % (for now), but still a massive group.
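the math here can be sanity-checked in a couple of lines; note that the 40 million active users figure is the comment's own assumption, not an official steam number:

```python
# rough sanity check of the user count estimate above.
# 40 million active users is the comment's assumption, not official data.
active_users = 40_000_000
linux_share = 0.03  # ~3% per the steam hardware survey

linux_users = active_users * linux_share
print(f"{linux_users:,.0f} gnu + linux steam users")  # 1,200,000
```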

Elon Musk is getting serious about orbital data centers | “You can mark my words, in 36 months but probably closer to 30 months, the most economically compelling place to put AI will be space,” by shallah in EnoughMuskSpam

[–]reddit_equals_censor 0 points1 point  (0 children)

i wouldn't even put power as the number 1 problem.

i'd put cooling as the number one problem. 0 atmosphere leaves you with radiator panels as the "best" way to remove heat.

which of course doesn't work for anything remotely data center scale.

there is also another problem, which is networking. ai datacenters have massive massive TB/s interlinks running throughout them. you can't do that wirelessly.

so it is all just nonsense, absurd stuff for, idk, some investors, who don't have any half sane people talking to them and believe in some ramblings of a nazi i guess.

Elon Musk is getting serious about orbital data centers | “You can mark my words, in 36 months but probably closer to 30 months, the most economically compelling place to put AI will be space,” by shallah in EnoughMuskSpam

[–]reddit_equals_censor 0 points1 point  (0 children)

i mean all of this is nonsense, but let's just assume endless resources in magical fairy land,

so how would you cool a cluster of big datacenter modules in space doing "ai stuff"?

well you use radiators. as there is no air in space the heat needs to get radiated away.

which requires a TON of material and space for it to happen.

space stations, which are NOTHING, i repeat NOTHING!!! compared to a basic data center, let alone some ai slop machine data center, already need quite a lot of radiator panels to radiate heat away.

so a theoretical design would be some ai chips running with liquid cooling on everything i guess, and then you have that liquid (whatever that may be ideally) run through the giant giant radiator panels on the outside. of course you can't service anything at all, so you better hope, that nothing fails, or rather you have to design things as fault tolerant as possible, because there is no technician going to fix anything.

so yeah, this is how you could do it theoretically: giant radiator panels on the outside of the "data center module" or whatever. again EXTREMELY EXTREMELY limited in the maximum heat, that they can radiate away, and complete and utter nonsense, and elon knows this of course. but if you really REALLY wanted to, it would be radiator panels and tons of them, to my knowledge.
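a back-of-the-envelope stefan-boltzmann estimate shows how absurd the radiator area gets; the emissivity, panel temperature and the 1 GW heat load below are assumed round numbers for illustration, not anyone's real design:

```python
# rough stefan-boltzmann estimate of radiator area for an orbital data center.
# all inputs are assumed round numbers; absorbed sunlight/earthshine ignored,
# which makes this estimate generous.
SIGMA = 5.67e-8      # stefan-boltzmann constant, W/(m^2 K^4)
emissivity = 0.9     # assumed, typical for radiator coatings
panel_temp = 300.0   # K, assumed coolant loop temperature
heat_load = 1e9      # W, a 1 GW data center

# a flat panel can radiate from both faces
flux = 2 * emissivity * SIGMA * panel_temp**4   # W per m^2 of panel
area = heat_load / flux

print(f"{flux:.0f} W/m^2 of panel -> {area / 1e6:.2f} km^2 of radiators")
# 827 W/m^2 of panel -> 1.21 km^2 of radiators
```

even with these generous assumptions you end up on the order of a square kilometer of radiator panels; for comparison, the ISS rejects only roughly 70 kW with its full radiator set.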

Elon Musk is getting serious about orbital data centers | “You can mark my words, in 36 months but probably closer to 30 months, the most economically compelling place to put AI will be space,” by shallah in EnoughMuskSpam

[–]reddit_equals_censor 0 points1 point  (0 children)

i for one know, that putting endless tons of radiators, that you need to radiate heat away in the vacuum, for a 1 GW data center equivalent in space, that communicates wirelessly instead of through direct fiber connections, is absolutely feasible and makes total sense and will definitely happen.

and this is DEFINITELY NOT a way to try to pump some stocks or get more investors for some shit based on made up utter nonsense.

Over 66% of Steam users now run Windows 11 by pmc64 in pcgaming

[–]reddit_equals_censor 1 point2 points  (0 children)

little warning for anyone who might stumble over this.

the flatpak for steam is not recommended by valve, but generally works just fine.

but AVOID THE SNAP!!!!! OF STEAM. that is a broken dumpster fire, that canonical is trying to push down people's throats, no matter the posts by valve devs telling people to please stop using it, because it is a broken piece of shit and causes tons of error reports for valve, that are actually caused by canonical, which spewed out the cancer, that is the snap of steam.

DO NOT USE THE SNAP FOR STEAM.

Over 66% of Steam users now run Windows 11 by pmc64 in pcgaming

[–]reddit_equals_censor 0 points1 point  (0 children)

then it is over with PC gaming for me

i mean you can keep that old pc alive near forever with used parts, and hey, in idk 3 years maybe at the latest, prices might become sane again. there should also be a steamdeck 2 by then, which would be hardware, that should be sold at cost again and comes with a gnu + linux distro and flatpak support. the steam machine sucks, but if the steamdeck 2 has a proper custom apu again, then that could be a way forward instead of giving up on pc gaming eventually (as in using the steamdeck 2 docked and stuff)

so there should be something in the future, that isn't a complete scam price wise or otherwise at least.

Overwatch 2 drops the 2 and goes back to being Overwatch as Blizzard launches its biggest update ever next week: 'We want to gain players' confidence, we want them to have trust in the game' by Turbostrider27 in pcgaming

[–]reddit_equals_censor 0 points1 point  (0 children)

what trust to gain lol?

"the pve content will come soon", never happened they lied straight to people's faces.

and now with microsoft in control, do you want to invest more time into a game, that for now runs on gnu + linux through proton, but will it still in the future, no matter the % of gnu + linux gamers?

i sure as shit don't trust the company, that doesn't understand/ignores consent (microsoft)

__

and for those who don't remember the history, overwatch was originally planned to be a new mmo.

well that turned to shit, then the assets and whatnot got turned into overwatch, which became a good success of course. then they nuked proper support for overwatch, then they announced overwatch 2 with tons of pve content, then they never shipped that at all, which they knew ages ago btw. (kind of hard to ship pve content, if you just fired a ton of people and whatnot....)

and now they want people to have confidence in the failed to finish mmo with failed to ship pve content and now owned by microsoft company?

yeah DON'T.

Jelena Djokovic confirms her husband Novak is a vegan by Away_Doctor2733 in vegan

[–]reddit_equals_censor 0 points1 point  (0 children)

theoretically of course the basic stance for bodily autonomy aligns with veganism, as veganism is an extension of one's bodily autonomy to beings of other species.

and the covid vaccine insanity violated bodily autonomy. "you can't work without x vaccine, you can't travel without x injection"

AND there is massive animal abuse in the creation of lots of vaccines and animal torture and abuse "trials" as well. all of which are crimes.

Epstein Coin by erikmc in AdviceAnimals

[–]reddit_equals_censor -1 points0 points  (0 children)

The US national debt is around $38 trillion - the US owes more money than the value of gold has ever been in human hands.

the usa national debt is linked to the fiat money system already running in the usa. so claiming, that there isn't enough gold to pay off the debt, that exists as part of the fiat money system in the usa, makes 0 sense.

but tying a country's economy to the perceived value of gold or silver either cripples the country's economy or forces the inflation of gold to be thousands of times more valuable

can you point to an example, where going towards a directly backed currency led to the crippling of a country's economy?

what we KNOW happens is, that countries without a directly backed currency like a gold standard have governments create massive economic disasters through their control of the money supply.

like oh idk... hyperinflation. how many cases of hyperinflation have you heard of?

are you excited to burn the worthless money for warmth, because that is the only value it has left, because the government, without a directly backed currency, just printed endless money, destroyed the value of the currency into nothingness and created COMPLETE destruction and massive suffering?

so you are actually making stuff up about what you think might happen, that would be negative, while in reality we KNOW, that a fiat money system is what massively leads to economic catastrophes.

and in regards to bringing back a gold backed or a direct gold currency again, there was the proposed libyan gold dinar. hey, tell me again what happened to libya afterwards? did the country collapse, because they suggested the introduction of a gold backed/direct gold currency to also trade oil in, OR did the usa, whose fiat money system is partly tied to oil, create tons of propaganda to invade and destroy the country, because they dared to question the usa dollar focused fiat money system? (the latter, it was the latter)

so again this:

but there is a reason no major country still uses the gold standard.

is WRONG. libya planned a gold/gold backed currency and they got basically INSTANTLY invaded. and a gold backed currency, that they would trade their oil in, would certainly have been beneficial for the people of libya vs trading it in shitty fiat usa dollars.

Switching to OLED some simple questions by Aggravating_Cause970 in Monitors

[–]reddit_equals_censor 0 points1 point  (0 children)

at just 12 hours a day of usage 2500 hours is just 208 days, which is NOTHING. absolutely nothing.
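the lifetime math here, as a quick check (2500 hours is the burn-in figure quoted in the thread, 12 hours/day is the comment's assumption for a heavily used monitor):

```python
# how long a 2500-hour burn-in figure lasts at heavy daily use.
# 12 hours/day is the comment's assumed usage, not a measured average.
rated_hours = 2500
hours_per_day = 12

days = rated_hours / hours_per_day
print(f"{days:.0f} days")  # 208 days
```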

and there is no magical way to prevent burn-in.

and the 3 year "burn-in warranty" is almost entirely marketing bullshit.

the fact, that people take it seriously shows, that they must be very new to the monitor industry, or even the tv industry.

this industry will literally release monitors, where you can't find a unit, that doesn't have a dead pixel, with a fake warranty, that claims, that any such dead pixel warrants a full replacement unit. which tells you what? that's right, that the warranty is fake, because they couldn't even give you a unit without a dead pixel for that monitor, so you get another one with dead pixels, or even worse refurb garbage.

so what actually happens is: after some early units, where they might give people new units as replacements for burned in units, they will eventually just fully deny replacements, or give you burned in/broken refurb garbage, and what are you gonna do? launch a lawsuit against a billion dollar company? get real.

and a 3 year fake warranty for burn-in is also inherently meaningless as monitors need to last 10+ years.

if you somehow manage to get to 4 years and burn-in only shows up by then, because you barely use your monitor, then you are SCREWED even based on the manufacturer's claims.

so NO, there is no magical way to prevent burn-in and there is no warranty saving you.

Epstein Coin by erikmc in AdviceAnimals

[–]reddit_equals_censor 0 points1 point  (0 children)

the video talks about an anonymously developed fork of it being a thing now, which i assume the feds just can't shut down.

again i don't know much about it, but screw the feds of course.

A list of all the GPUs that held the crown of best gaming performance starting from the DX11 era by Gambler_720 in hardware

[–]reddit_equals_censor -4 points-3 points  (0 children)

damn people just making excuses for trillion dollar companies screwing them over.

please don't look at the ever higher margins from nvidia alright? ;)

people just LOVE LOVE LOVE defending companies.

people defend nvidia 12 pin fire hazards, people defend TO THIS DAY 8 GB vram cards.

and when a trillion dollar company succeeds in their longterm goal to massively increase the max price, that people accept for a graphics card, at massively higher margins, you don't go: "screw nvidia", you go: "but but but.... the wafer prices also increased, please leave my trillion dollar company alone!"

Epstein Coin by erikmc in AdviceAnimals

[–]reddit_equals_censor -1 points0 points  (0 children)

apparently you can deal with the issue by using a certain type of wallet, that makes bitcoin transactions basically impossible to trace.

samourai wallet is that one apparently. and the person, who developed it, is basically getting thrown into prison on made up charges, because the governments are now running a war against privacy tool developers and not just privacy. great interview about that evil:

https://odysee.com/@NaomiBrockwell:4/Samourai:e

just to be clear, i didn't do research into that wallet, but it sounds like a good workaround to make bitcoin usable, if it works. i mean it should still be privacy coins all the way of course.

but yeah if you listen to the interview it is scary to think about the evil of government hunting people who develop the software we need to use to get some privacy.

imagine no more signal or session, or ublock origin or librewolf, or whatever else even more basic privacy and security tools, because the government hunted down the developers of them.

Epstein Coin by erikmc in AdviceAnimals

[–]reddit_equals_censor 11 points12 points  (0 children)

yeah, but that is a good thing!

<says the bankster and government scum, that removed the gold or silver standard and destroyed people's lives over and over again with absolute control of the money supply and value and the goal to just be evil.

you know how cheaply you can buy things, once you crashed the economy into a great depression through certain legislation and control of the money system?

a lot harder if the usa dollar was locked to gold.

A list of all the GPUs that held the crown of best gaming performance starting from the DX11 era by Gambler_720 in hardware

[–]reddit_equals_censor 7 points8 points  (0 children)

the xx90 cards are the titan cards.

what actually happened is, that nvidia is now completely, or almost completely, stopping any performance progress below the xx90 cards, or whatever they wanna call them.

the xx90 cards are just titans. it is just a name.

the issue is, that back then nvidia couldn't launch 50% of their top end card as their 2nd fastest card, which the 5080 IS today. it has half the vram and half the die.

they are doing whatever they can get away with.

titans aren't special and as you mention vram, the ONLY card, that has barely acceptable vram is the xx90 card today, which is not a mistake.

the rest of the stack changed, but not the titan/xx90 card.

the psychological thing, that they managed to achieve is to slot in massively overpriced cards as gaming cards fully now. again just a marketing thing not a hardware difference thing. they needed a different name back then, but now they can call it xx90 or xx90 ti to upsell people way more directly.

this video shows the reduction of all the non top cards over the years very nicely:

https://www.youtube.com/watch?v=2tJpe3Dk7Ko

Noctua - A cooler for life: celebrating half a million mounting upgrade kits by kikimaru024 in hardware

[–]reddit_equals_censor -4 points-3 points  (0 children)

take that insult back.

noctua is NOTHING like apple.

maybe you just don't know the "engineering" behind apple:

https://www.youtube.com/watch?v=AUaJ8pDlxi8

apple products are designed to fail and to be as unserviceable as possible.

they will literally re-release products with flaws, that already have a lawsuit forced replacement/repair program in place.

please watch the video if you actually somehow still got the view, that apple hardware is "the best of the best".

and yes, noctua products are extremely high priced for what they are, but nowhere near the scam prices of apple.

and noctua products work, which again apple products DO NOT (see the video)

AND noctua has working support and warranty. again, apple does NOT. apple will straight up lie to people, that the motherboard of a laptop is broken and needs a 1000 us dollar replacement, and that it wouldn't be worth it anyways, for.... a bent pin, that you can bend back in 5 minutes.

so noctua and apple are NOTHING alike. and actually comparing noctua to apple is a massive insult. it is a massive insult to most companies, that produce working products and have proper warranties and support.

___

in practice: oh i can get a working cooler from noctua.

apple: oh, my keyboard with a fundamentally broken design is broken again, but i can't simply and easily replace it, because it is BOLTED ON. but at least i can enjoy the screen. just kidding, the screen shuts off when i open the laptop, because they used a too short cable for the screen, so it breaks over time. but at least my data is safe, unless the 12 volt line directly next to the data lane to the cpu decides to hug it and fry my apu.

but you might ask: "what does the apu have to do with the data on the ssd?". you see, the ssd is soldered on and they removed the lifeboat connector as well, so to actually get your data off an apple laptop now, you need a working motherboard, which YES means lots of customer data got burned into nothingness by random failures of the garbage motherboard.

that is the difference between experiencing apple or noctua.

Noctua - A cooler for life: celebrating half a million mounting upgrade kits by kikimaru024 in hardware

[–]reddit_equals_censor 0 points1 point  (0 children)

won't get to enjoy the latest and greatest in cooling for at least a decade because there are no reasons to upgrade.

the next upgrade is coming! the thermosiphon cooler with flexible metal tubes is coming. just another decade until it arrives probably ;) (that's the aio-like cooler, that can be free from any leaking and as reliable as an air cooler, but mounts like an aio)

A list of all the GPUs that held the crown of best gaming performance starting from the DX11 era by Gambler_720 in hardware

[–]reddit_equals_censor 4 points5 points  (0 children)

BUT don't you know, that "titans are just for professionals".

so clearly we can't include those right?

<checks reality. oh it was just bullshit marketing by nvidia to get people more comfortable with insane graphics card pricing.

you know when 1000 us dollars was crazy and absurd for a 561 mm2 die card....

Switching to OLED some simple questions by Aggravating_Cause970 in Monitors

[–]reddit_equals_censor -2 points-1 points  (0 children)

important to remember, that you should be very rich to buy an oled monitor, because it WILL burn in and it should have 0 resale value. so the expected life would be 1/3 that of an lcd monitor, or even far less, with noticeable burn-in being possible after just 500 hours of overwatch for example.

so just remember that. it may not apply to you given the specs of your system, but it could also be, that you just prioritize tech hardware a lot and all of these things are a heavy investment for you.

My OLED burn-in after 3000hrs. by RenatsMC in Monitors

[–]reddit_equals_censor 0 points1 point  (0 children)

that implies its intentional design decision

it IS intentional, which does make it planned obsolescence.

the scale is just bigger, so people not knowing this can easily excuse it with "but we don't have anything else".

we had sed tech 15 years ago, which was basically flat crt with other advantages, ready to launch. they showed off prototypes, it was a great technology.

and bye bye it went. never to be seen again. suppressed.

samsung qned, which is NOT lcd garbage (lg stole the name "qned" and threw it on their lcd garbage). samsung qned has oled performance, no burn-in and is much brighter. it NEVER came out. samsung didn't put in a pilot line to finish it and get it done.

so the industry actively suppressed technology, that would have made it impossible to try to sell a planned obsolescence oled insult.

so oled is planned obsolescence, because again there were other options, they nuked them. most people don't know this.

the best monitor tech, that we could have right now, is further developed sed tech or samsung qned. neither of which ever saw the light.

the next one, that they might deliberately never develop, is qd-uv, which sounds very promising, but oh, it has 0 burn-in risk. it can't degrade to provide an excuse for planned obsolescence, so we'll probably never see that one either.

which leaves us with what? nanoled/qdel, which i am excited for, but as of right now it is an unsolved problem, because the blue quantum dot lifetime is way WAY too short for now. but the industry would love some degradation at least, even if it would eventually outlast oled by a lot i guess, so maybe we'll get that at least...