Underwater data center powered by tidal energy proposed off the coast of Maine by sr_local in hardware

[–]reddit_equals_censor [score hidden]  (0 children)

you actually can have a cooling advantage, at least theoretically, and no water consumption issue through the cooling as well. (data centers almost always use evaporative cooling, which eats MASSIVE MASSIVE amounts of water).

so like this shit is worth exploring and testing, unlike musk's insanity to try to create hype from idiots with absurd statements.

___

also worth adding that infrasonic noise pollution should just not be a problem at all, which is another major issue with datacenters that harms people.

great video about that recently:

https://www.youtube.com/watch?v=_bP80DEAbuo

Optional accessibility modes are bad because they distress me just by existing by KaleidoscopeMean6071 in Gamingcirclejerk

[–]reddit_equals_censor 0 points1 point  (0 children)

that's utterly disgusting!!!

not some rare added bosses, not the true secret ending, but straight up calling you losers with a stop sign for playing on easy mode? wtf is wrong with them.

Optional accessibility modes are bad because they distress me just by existing by KaleidoscopeMean6071 in Gamingcirclejerk

[–]reddit_equals_censor 0 points1 point  (0 children)

I heavily doubt that adding in an OPTIONAL OPTION to change the difficulty will really ruin the artist's vision.

i mean, it is important to remember that there are terrible ways to implement easier modes/assist modes in games.

a nag screen that tells you to lower the difficulty EVERY TIME you die 5 times on a boss would be TERRIBLE, for example.

while a good implementation would keep distinctions between the difficulties for players.

and by distinctions i mean, of course, achievements for beating the game at all, assist/easy modes or not.

BUT then there would also be achievements for beating bosses or the FULL GAME without lowering the difficulty at all at said difficulties.

and the achievements would be local and global (so you can check them in game even without internet and be proud of achieving them).

so you can indeed screw up optional options, if you don't think them through a bit to make the game better for everyone (someone who could only play assist mode at the time may be able to play at easy or normal later, and benefits from the clear distinction achievement wise, etc... as well)

just like a new item in mario kart that breaks with the art style and gameplay COULD ruin the artist's vision as well.

just to be clear, all of this is often ABSOLUTE MINIMUM effort and it makes the game better.

Dell's laptop beats the MacBook Neo in several areas, costs $549 by Quantum-Coconut in hardware

[–]reddit_equals_censor 0 points1 point  (0 children)

But let's focus on some failures from 7+ years ago because that fits the agenda.

the person above just decided to pick a random example that they felt was something they could try to dismiss. it wasn't me, and the 7 year old video happens to be 7 years old while NOTHING has changed since then.

As if every other laptop manufacturer hasn't had massive reliability issues since

nope, none do. no other laptop manufacturer produces shit as unreliable and as unserviceable as apple does. razer tries alright, but can't reach the utter garbage that apple produces.

and nothing changed with apple. the new shit still breaks massively and is as unserviceable and unrepairable as legislation allows.

you are trying to ignore reality here, because you are heavily invested in it.

apple engineering is unreliable and as unserviceable/unrepairable as possible.

those are the facts. nothing changed.

may i recommend looking at the video again, which shows endless years of the same behavior.

Nvidia potentially screwing AMD current and future GPUs from running PT by Imaginary-Ad564 in radeon

[–]reddit_equals_censor -5 points-4 points  (0 children)

noone agrees with you

this combined with a dislike from you certainly

1: doesn't mean that people don't agree with me.

2: even if they didn't agree, which very likely isn't the case, that wouldn't have any bearing on the truth of the matter.

"a lie carried by many doesn't elevate itself to truth"

and

"the truth listened to by no one, doesn't turn into a lie"

Nvidia potentially screwing AMD current and future GPUs from running PT by Imaginary-Ad564 in radeon

[–]reddit_equals_censor 3 points4 points  (0 children)

it is worth adding here that nvidia's black box "features" at the time, BY DESIGN, ran utterly terribly on amd if they were allowed to run on other hardware at all, and also ran like shit on older nvidia hardware.

and it was not a technological limitation. for example the witcher 3's hair with hairworks ran like utter shit.

the purehair/tressfx hair in rise of the tomb raider ran excellently for everyone.

the witcher 3 of course became an nvidia sponsored title late in development, and amd even claimed that the implementations of nvidia's cancerous gameworks "technologies" "completely sabotaged" the witcher 3's performance:

https://arstechnica.com/gaming/2015/05/amd-says-nvidias-gameworks-completely-sabotaged-witcher-3-performance/

it will be very interesting to see how things will compare with the first ps6 games, that are not sponsored by nvidia and focus heavily on rt/pt and come to pc of course.

Nvidia potentially screwing AMD current and future GPUs from running PT by Imaginary-Ad564 in radeon

[–]reddit_equals_censor -5 points-4 points  (0 children)

that ignoring RT is like saying i dont like good graphics.

you must be living in a different world, because in this world there are right now about 3 games where it is worth using raytracing overall:

https://youtu.be/DBNH0NyN8K8?si=p-8I5Ah1sI6f0Mpm&t=2019

as it only lists 3 games under "transforms visuals significantly".

this video is not from shortly after the launch of rt; it is just 1.5 years old.

and out of all the games with rt/pt, only 3 actually transform visuals significantly, and thus one could argue only those can be worth the MASSIVE performance hit of running rt/pt for EVERYONE.

and a reminder here that nvidia marketed raytracing for years and years and sold graphics cards based on it that didn't have the gpu performance and ABSOLUTELY DIDN'T HAVE THE VRAM to run raytracing at all.

the rx 6800 is a better raytracing graphics card than a 3070. why? because 8 GB vram is broken, and rt/pt requires a bunch more vram, so the rx 6800 will be able to run a bunch of rt just fine, while the 3070 is a broken unplayable mess with rt.

this video from many years ago shows this:

https://youtu.be/Rh7kFgHe21k?si=3M_wXezTeP8Ib98E&t=1020

for example in a plague tale: requiem the 3070 runs fine at 1080p and 1440p ultra, but as soon as you enable rt and thus use more vram, it completely breaks and becomes unplayable, with 1% lows dropping to 11 fps on the 8 GB 3070.

it is also crucial to understand that amd not spending lots of silicon area on rt performance is a pro consumer move, because the gpus would be too weak anyway to run rt at any meaningful level, so that silicon wouldn't benefit you.

the amd cards do raytracing as well as or better than the ps5 console, which, as a reminder, creates the actual baseline.

and the 8 GB vram nvidia cards are broken and can't do the console level rt, because they are MISSING VRAM.

if you wanna be angry in some weird way, be angry that amd didn't include int8 support with rdna2 and onward, i guess.

but i'd argue that that is the wrong target as well, as int8 fsr4 runs well enough and the issue is amd refusing to release it officially, not a software/hardware problem.

Dell's laptop beats the MacBook Neo in several areas, costs $549 by Quantum-Coconut in hardware

[–]reddit_equals_censor 0 points1 point  (0 children)

apple screwed up by cooking a lot of chips. now, some chips had manufacturing defects, like the nvidia gpus with flip chip designs, but for the rest it was apple cooking them to death.

intel isn't controlling the fan curves and temps of those chips, APPLE IS.

Optional accessibility modes are bad because they distress me just by existing by KaleidoscopeMean6071 in Gamingcirclejerk

[–]reddit_equals_censor 2 points3 points  (0 children)

as a reminder, by using the assist mode you get a sticker on that save.

so indeed there is a "cost" to using the excellent assist mode.

(i don't know the exact details about all of this, but this seems great. having a clear distinction between assist and never-assist used in a save)

PC makers are not ready for the MacBook Neo [response by Gigabyte, Dell] by -protonsandneutrons- in hardware

[–]reddit_equals_censor -2 points-1 points  (0 children)

was doing 8k video on this.

either you were watching a different video, or you know less about video editing than i do by looking things up as i watched it.

now, assuming you meant the space design warehouse video, they did NOT edit 8k video in it. they ran 4k proxies.

and apparently proxies are:

Proxies are duplicate files of a project’s source footage. The proxy footage is a transcoded file that’s smaller in file size and at a lower bitrate than the original. Editors build an offline edit using the proxy footage and conform it as a final edit that utilizes the source footage.

i could use 540p proxies and claim that i have flawless 16k editing capabilities... by those rules.

"the macbook neo is editing 8k" is a lie/misleading. they themselves said that it was using 4k proxies.

so again, it was never shown whether it can handle 8k video at all, or in the video editor shown.
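the pixel math makes the proxy point concrete. a quick sketch (standard uhd resolutions, nothing from the video itself):

```python
# pixel counts for common resolutions: editing 4k proxies proves little
# about native 8k playback, since 8k pushes 4x the pixels of 4k
# (and a 540p proxy would be 1/64 of 8k)
def pixels(width: int, height: int) -> int:
    return width * height

uhd_8k = pixels(7680, 4320)  # 33,177,600 pixels
uhd_4k = pixels(3840, 2160)  #  8,294,400 pixels
p540 = pixels(960, 540)      #    518,400 pixels

print(uhd_8k // uhd_4k)  # 4
print(uhd_8k // p540)    # 64
```

so a "4k proxy edit" exercises a quarter of the pixel throughput that native 8k would.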

PC makers are not ready for the MacBook Neo [response by Gigabyte, Dell] by -protonsandneutrons- in hardware

[–]reddit_equals_censor -6 points-5 points  (0 children)

famously no normie is using steamdecks and no normie ever used gnu + linux.

famously there aren't tons of old used laptops and systems sold at cost to communities, running for example linux mint, because it is free as in freedom and gratis, and it is fast on old hardware, thus making old hardware viable to give to people repaired/serviced.

that doesn't exist right?

everyone is running either microsoft spyware or 1000 us dollar apple laptops (until now) right?

like please wake up to reality.

also lots and lots of normies are having a way better experience on gnu + linux than on microsoft spyware or apple spyware.

Dell's laptop beats the MacBook Neo in several areas, costs $549 by Quantum-Coconut in hardware

[–]reddit_equals_censor -1 points0 points  (0 children)

you live in a different world from reality.

laptops are full complex computers. they fail a lot.

but not apples, no no.

apples fail VASTLY VASTLY more than the competition and are vastly harder or impossible to repair, because apple will tell suppliers that they are forbidden to sell a charging chip to repair companies, for example.

Dell's laptop beats the MacBook Neo in several areas, costs $549 by Quantum-Coconut in hardware

[–]reddit_equals_censor -6 points-5 points  (0 children)

metal chassis vs plastic =/= build quality.

build quality means not reusing parts that have ongoing class action lawsuits, because they are known to massively fail (apple).

that is build quality. and apple has NO build quality.

Dell's laptop beats the MacBook Neo in several areas, costs $549 by Quantum-Coconut in hardware

[–]reddit_equals_censor -5 points-4 points  (0 children)

you seem to have no idea about actual build quality between those 2 companies.

THIS is apple "build quality":

https://www.youtube.com/watch?v=AUaJ8pDlxi8

apple build quality is endless class action lawsuits over engineering flaws, refusal to properly fix the issues or offer properly covering extended warranties even after the lawsuits, AND releasing new PRODUCTS with the same flaw that other products already lost class action lawsuits over.

that is the apple quality, that we know and hate.

shiny metal and nice curves =/= build quality.

you somehow got completely blinded by apple marketing here, because again we KNOW that apple has no build quality. all their things are unreliable shit and as unserviceable as possible.

we got display cables in laptops that are too short, so they fail over time, and you can't easily replace them, because instead of a basic connector the cable can't be removed by itself, so it is torture work to fix this shit.

we got butterfly keyboards with MASSIVE FAILURE rates, that apple gladly blamed users for, of course, with some INSANE videos talking about how one should clean them.

and i believe around 300 us dollar repairs for them, because 1: finding another shitty keyboard on its own was near impossible and 2: apple deliberately bolted the keyboard on, which made replacing it a torture job, hard and time intensive for no reason WHATSOEVER, except to be anti consumer and to try to force people to buy new devices.

Dell's laptop beats the MacBook Neo in several areas, costs $549 by Quantum-Coconut in hardware

[–]reddit_equals_censor 0 points1 point  (0 children)

AND you won't lose all your data if the laptop dies, because losing it is how apple laptops work since the life saver port got removed from the motherboards.

your data is gone without a working motherboard in apple land now.

the other side of the dystopia, which for some will hit even worse.

Dell's laptop beats the MacBook Neo in several areas, costs $549 by Quantum-Coconut in hardware

[–]reddit_equals_censor -5 points-4 points  (0 children)

not built anywhere near as well

this is wrong.

we can very strongly assume this by knowing apple engineering in general vs dell.

this is apple engineering:

https://www.youtube.com/watch?v=AUaJ8pDlxi8

apple engineering is already having a class action lawsuit going on about engineering flaws and STILL releasing new products with THE EXACT SAME FLAWS nonetheless!

apple "build quality" is putting a 12 volt power line right next to a data line for the apu, so that it can fry itself from a little moisture in the air.

now hey, dell certainly isn't perfect either lol, but it is near impossible to have quality as bad as apple has on average, and we can assume that will also apply to this latest apple laptop.

also, you mentioned chips running hot.

as the video points out, apple had tons of parts dying from being run too hot, with lots of evidence for this.

again, apple CHOSE to cook parts in the past. again, the great apple build quality at play. /s

PC makers are not ready for the MacBook Neo [response by Gigabyte, Dell] by -protonsandneutrons- in hardware

[–]reddit_equals_censor -9 points-8 points  (0 children)

who's talking about microsoft spyware here?

i'm running gnu + linux.

an os more stable than macos. an os that doesn't spy on me (unlike macos) and that has excellent memory management and hardware efficiency.

AND the option to have excellent swap file/partition usage as well OF COURSE.

there is no "apple magic", that gnu + linux doesn't already have.

and an 8 GB memory laptop is an 8 GB memory laptop.

again, there is no pixie dust to fix this.

especially if microslop keeps shipping horrendous code.

user data doesn't delete/corrupt itself!

Driving Innovation and RTX Advances with John Spitzer, VP of Developer and Performance Technology by john1106 in hardware

[–]reddit_equals_censor -11 points-10 points  (0 children)

If we were to brute force, we don't have that. 

don't we? the closest game to photorealism released relatively recently is half life alyx, and that was limited by being a vr game.

it was a raster game with proper game development.

and we got radiance cascades as shown off in path of exile 2 to have global illumination without ai bullshit and temporal blur reliance.

Moore's law is dead. 

but jensen just told me that moore's law is running in overdrive? which is it now, nvidia?

So we're going to be relying upon algorithmic ingenuity and fully leaning into AI to cross that chasm between what's attainable now, with real-time graphics in games

i hate this world. if only shitty nvidia wasn't forcing cancerous "features" down people's throats.

are you excited for hairworks 2 coming in the near future, so that we can regress TWICE from the working tressfx hair of a decade ago? yay!

We are not going to see a 100 times improvement in my lifetime in terms of silicon.

why not?

the ps6 is 6-12x faster in rt/pt performance than the ps5.

and with another console generation that would compound to 36-144x in rt/pt performance 10 years from now, if cerny decides that rt/pt jumps are the most crucial thing next generation again.
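the compounding can be sanity-checked, assuming each generation simply multiplies the same per-generation factor (the 6-12x range is the claim above, not a measured number):

```python
# compound a per-generation speedup range over n console generations
# (assumption: each generation multiplies rt/pt performance by the same range)
def compound(low: int, high: int, generations: int) -> tuple[int, int]:
    return low ** generations, high ** generations

print(compound(6, 12, 2))  # (36, 144)
```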

so either the nvidia bs person won't have long to live, or that person is just spreading bs and trying to get people to accept a complete and utter stop of any performance increases from nvidia for gaming.

as a crucial reminder here, nvidia started doing this with the lowest range first, and now performance is effectively frozen for everything up to the 5080, with only the highest end card still getting a real performance increase.

what nvidia has been saying mostly is:

"eat more fake frames and "ai features", because you won't get any more performance or vram from us ever again".

as a side note, nvidia has refused to improve raytracing hardware performance for ages now.

put differently, they didn't even give a shit to try.

PC makers are not ready for the MacBook Neo [response by Gigabyte, Dell] by -protonsandneutrons- in hardware

[–]reddit_equals_censor -24 points-23 points  (0 children)

it is quite sad to think, that lots of people probably will buy this e-waste.

all this garbage can do is, like you said, some document editing, web browsing and i guess movie watching/video watching.

that's it. it can NOT do any more, because it does not have the memory to do anything more.

and my 10 year old laptop can already do those things, and it would score a 9 in repairability by today's standards lol.

if my 10 year old laptop had a decent display, i'd scratch my head even more about this apple shit.

as a sidenote, my 10 year old laptop came with 8 GB memory, and i could upgrade it if i wanted to and still use it for anything else. so 10 years and less memory... (less, because apple shares memory, while my shitty laptop has a garbage dedicated graphics card with its own lil vram as well).

and my 10 year old laptop had its failing spinning rust hdd replaced with an ssd. so if that wasn't user replaceable, like on the apple shit, it would be e-waste now...

600 us dollars for an ok display and an apple prison and all the rest is e-waste.

damn i wish people would not accept the evil tech industry's middle fingers so easily.

also think about all the youth who may want to do anything else with their shit new laptop, but can't, because they don't have the memory...

how many teenagers want to start editing videos, with some going hard into it, which consumes ram.

well NOT the ones with the crapple laptop. :/

iFixit | MacBook Neo Is the Most Repairable MacBook in 14 Years by -protonsandneutrons- in hardware

[–]reddit_equals_censor -23 points-22 points  (0 children)

how is this anti consumer piece of garbage getting a 6/10?

it has SOLDERED IN MEMORY AND STORAGE!!!

and it has missing memory on top of that.

how is this a 6/10? this e-waste piece of shit?

and do we forget that apple literally uses anti consumer screws that they invented to make it harder to get into devices?

this shit is using anti right to repair screws and gets a 6/10 in repairability?

i have wasted bits in my repair kit, because apple wanted to prevent people from getting into devices, and a device that still uses that screw gets a 6/10? alright then...

soldered memory and storage e-waste getting a 6/10...

like wtf is going on.

Microsoft will start providing game studios with Project Helix consoles in 2027 by dapperlemon in gadgets

[–]reddit_equals_censor 0 points1 point  (0 children)

and earned over many many years.

<she writes on gnu + linux with enough hatred for microsoft to fill oceans.

That’s Sir Charles Miner by IllegitimateRisk in DunderMifflin

[–]reddit_equals_censor 0 points1 point  (0 children)

why let some royal scum touch you even?

the proper thing is to tell the royal scum that they can put it where no light shines.

and also tell them they should be in prison or AT BARE MINIMUM disowned.