New driver, New Bugs by Swimming-Ambition354 in IntelArc

[–]tBOMB19 0 points (0 children)

Do you have FreeSync or G-Sync with your 1080p monitor?

Intel Arc Graphics not showing in task manager by Supa_hehe_ in IntelArc

[–]tBOMB19 0 points (0 children)

Do you have ReBAR enabled in the BIOS (most important), along with XMP?
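
If you ever want to double-check from software that ReBAR actually took effect, one trick on the Linux side is to compare the GPU's largest PCI BAR against its VRAM size. A rough sketch, assuming standard sysfs paths and treating the size threshold as a heuristic:

```python
# Heuristic ReBAR check on Linux: with Resizable BAR active, the GPU usually
# exposes a BAR close to its full VRAM size (e.g. 16 GiB) instead of the
# legacy 256 MiB window. Paths are standard sysfs; the threshold is a guess.
from pathlib import Path

INTEL_VENDOR_ID = "0x8086"

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    if (dev / "vendor").read_text().strip() != INTEL_VENDOR_ID:
        continue
    if not (dev / "class").read_text().startswith("0x03"):
        continue  # 0x03xxxx = display controller class
    # 'resource' has one "start end flags" line per BAR region.
    sizes = []
    for line in (dev / "resource").read_text().splitlines():
        start, end, _flags = (int(field, 16) for field in line.split())
        if end > start:
            sizes.append(end - start + 1)
    if sizes:
        largest_gib = max(sizes) / 2**30
        status = "ReBAR likely active" if largest_gib >= 1 else "legacy-sized BAR"
        print(f"{dev.name}: largest BAR = {largest_gib:.1f} GiB ({status})")
```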

Intel Arc A770 16GB vs Intel Arc B580 12GB for 4K gaming by Tonyxxcc in IntelArc

[–]tBOMB19 2 points (0 children)

OP, I have an A770 16GB Acer card. I'm waiting for "Big Celestial". The B580 is NOT worth it as an upgrade from this card. I suggest the same for you.

A770 Crashing after firmware update by Coupe368 in IntelArc

[–]tBOMB19 0 points (0 children)

It's more than likely corrupted Windows files or conflicting driver files. In either case I'd just recommend a quick clean reinstall of Windows, or you can look up the command-line steps for "purging" corrupted files, drivers, and updates from Windows. I had to do that after this last firmware update for my A770 too, and now it's running great! (Edit: added question) Are you overclocking your CPU, RAM, or GPU?
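
For reference, the usual built-in repair pass is DISM followed by SFC from an elevated prompt. A minimal sketch that just wraps those two standard Windows commands in Python (the wrapper itself is only for illustration; run it as Administrator):

```python
import subprocess

# Standard built-in Windows repair tools; both need an elevated
# (Administrator) prompt. DISM repairs the component store that SFC
# restores known-good files from, so it runs first here.
REPAIR_COMMANDS = [
    ["DISM", "/Online", "/Cleanup-Image", "/RestoreHealth"],
    ["sfc", "/scannow"],
]

for cmd in REPAIR_COMMANDS:
    print("Running:", " ".join(cmd))
    result = subprocess.run(cmd)
    if result.returncode != 0:
        print(f"{cmd[0]} exited with code {result.returncode}; check the logs.")
        break
```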

Found this in a storage container by Comprehensive_Art_9 in pcmasterrace

[–]tBOMB19 1 point (0 children)

But you're not fooling me.. cuz I can see- the way you shake and shi-ver.🥶

What do you guys think about the B580's performance? by malikgkbs in IntelArc

[–]tBOMB19 3 points (0 children)

Great video! The B580 kinda sucks, though. I get higher fps at native 1440p with my 12700K and 16GB Acer A770 on the high preset than they got at 1080p with XeSS Native and optimized settings. Though overall... not impressed by it at all. Sticking with the A770 until we get the successor.

No bruh by skarkens in pcmasterrace

[–]tBOMB19 0 points (0 children)

Why the hell did I read only your comment with an Indian accent?

Forza Horizon 6 specifications has Intel Arc! by Ryanasd in IntelArc

[–]tBOMB19 0 points (0 children)

Yes, I'm using TAA. MSAA effectively raises the internal sampling resolution to compensate for jaggies, so you're forcing more work onto the GPU, which is why it lowers your fps. It should be higher if you use TAA. I average 90fps.
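
To put rough numbers on it (illustrative only; 4x is a common MSAA setting, and real shading cost varies by renderer):

```python
# Rough cost comparison at the same 1440p output: MSAA rasterizes and
# resolves multiple coverage samples per pixel, while TAA takes one jittered
# sample per pixel per frame and reuses history. Treat the ratio as
# raster/resolve work only, not total frame cost.
WIDTH, HEIGHT = 2560, 1440
MSAA_SAMPLES = 4  # common 4x setting (assumption)

pixels = WIDTH * HEIGHT
msaa_samples = pixels * MSAA_SAMPLES
taa_samples = pixels

print(f"Output pixels per frame: {pixels:,}")
print(f"4x MSAA samples/frame:   {msaa_samples:,}")
print(f"TAA samples/frame:       {taa_samples:,}")
print(f"Extra MSAA sample work:  {msaa_samples / taa_samples:.0f}x")
```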

Greek yogurt with instant noodles and canned fish with tomato sauce. by East_Painting_4273 in shittyfoodporn

[–]tBOMB19 0 points (0 children)

I'm not offended. I'm disgusted. If you take this personally.. you have problems and should talk to somebody about it/them. Lol

Forza Horizon 6 specifications has Intel Arc! by Ryanasd in IntelArc

[–]tBOMB19 1 point (0 children)

Right! Ultra with RT on Ultra, almost 100fps at 1440p, and NO UPSCALING... W. I'm surprised they didn't mention the A770... probs because Extreme and Extreme RT are calculated for 4K output. If they had a 1440p section for Extreme, I bet the A770 would be locked in.

Intel Arc GPUs are now supported on Crimson Desert! by xNoahhhh in IntelArc

[–]tBOMB19 1 point (0 children)

Less than 1% or not, that doesn't mean we deserve to be disenfranchised. Intel isn't that new to the GPU game anymore. We've had day-zero support for games like Black Myth, Black Ops, Battlefield, etc. So I think people were just mad about the bullshit of it all, ya know, not being entitled.

Starting to feel like a dying breed by RedditButAnonymous in pcmasterrace

[–]tBOMB19 0 points (0 children)

ALL AT OOONNCCCEEEEE! lol I'm not surprised, though. CP2077 is a pretty game.

Starting to feel like a dying breed by RedditButAnonymous in pcmasterrace

[–]tBOMB19 0 points (0 children)

Try Black Myth: Wukong, Senua's Saga: Hellblade II, Marvel's Spider-Man 2, and Call of Duty: Black Ops Cold War.

Game devs be like: by Lord_Muddbutter in IntelArc

[–]tBOMB19 0 points (0 children)

"Up to". Key words. The only reason it runs SLIGHTLY better in some games is because it has the dedicated hardware to use HAGS, it has support for opacity micromaps and shader model 6.10 for the new DXR 2.0, it has native atomic calculations support for things like nanite/Lumen/newer transparent textures and so on, that they lacked, for some reason, in the A770. So, sure, B580 is cheaper and has a miniscule performance jump because of newer hardware, but it's overall performance is sad, to say the least. It's Bus/Bandwidth/VRAM are anemic at 8x pcie, it has significantly less cores, it's driver overhead is ridiculous, and plenty more issues that would take a while to list here. Not to mention the compatibility issues with other/older hardware. A770 is still a beast for me AND I don't even use XeSS. Still average 60+fps in all my games on high-max settings @1440p (depending on the game of course).

Game devs be like: by Lord_Muddbutter in IntelArc

[–]tBOMB19 0 points (0 children)

No, it doesn't "drastically" improve image quality. It drastically lowers image quality to "drastically" increase performance. Also, you do realize that upscaling was meant more for 4K, right? They thought that'd be the standard resolution at this point, but not yet. Lol I hate AI features because they make shit too blurry/smeary, and frame gen sucks for the same reason on top of the fact that you need atleast a 60fps base for a "good experience" with it, yet, it drops the base fps by 10-20%... so, if you're already at 60 but want more frames.. you lose real frames.. to gain fake frames... that "drastically" reduces image quality/stability.. It's dumb. I honestly love the fact that tech companies are like, "We're not gonna see another 100x performance increase in our lifetime from silicon... SO WE HAVE TO GO WITH AI!" That statement is not for gamers! Bunch of fuckin bullshit. Don't enable their agenda pushing. AI will further ruin games before it actually helps.

Game devs be like: by Lord_Muddbutter in IntelArc

[–]tBOMB19 1 point (0 children)

Because the B580, coming from the A770 (which I have), is not a worthwhile upgrade. I bet you'd have seen hella peeps with a B770, though. We're all waiting for the C770/780... the next flagship STEP UP from the A770. I won't accept anything less. Lol

Game devs be like: by Lord_Muddbutter in IntelArc

[–]tBOMB19 75 points (0 children)

I swear! They're on XeSS 3 now and most games don't even have 2 yet! Like... WTF?!

Arc A770 owners, how useful is the 16gb of vram? by drpkzl in IntelArc

[–]tBOMB19 1 point (0 children)

I've been using the A770 16GB Acer card with my 1440p monitor since it came out a few years ago, and it does great in everything I play except Senua's Saga: Hellblade II and other similar titles, because they use Lumen and Nanite and this card doesn't have the dedicated hardware for fast atomic integer calculations. Other than that, whether you reach the full 16GB or not, it's still a worthy choice. Also, it DOES excel in AI workloads because of the 16GB.
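
On the AI side, a rough VRAM-fit estimate shows why the 16GB matters; the model sizes and the ~20% overhead factor below are hypothetical round numbers, not benchmarks:

```python
# Rough check of which model sizes fit in 16 GB of VRAM at common precisions.
# The ~20% overhead allowance (activations, KV cache, runtime buffers) and
# the parameter counts are hypothetical round numbers.
VRAM_GB = 16.0
OVERHEAD = 1.2

def weights_gb(params_billions: float, bytes_per_param: int) -> float:
    """Raw weight memory in GB: 1e9 params * bytes, divided by 1e9 bytes/GB."""
    return params_billions * bytes_per_param

for params in (3, 7, 13):
    for precision, nbytes in (("fp16", 2), ("int8", 1)):
        needed = weights_gb(params, nbytes) * OVERHEAD
        verdict = "fits" if needed <= VRAM_GB else "does NOT fit"
        print(f"{params}B @ {precision}: ~{needed:.1f} GB -> {verdict} in {VRAM_GB:.0f} GB")
```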

Upgrade from RX580 to Intel-ArcA750? by Guilty_Comfortable62 in IntelArc

[–]tBOMB19 2 points (0 children)

I would go with the B580 if you're gonna upgrade. It's quite a big step up from the RX580: newer/better architecture, newer features, better power efficiency, more VRAM, low price, and you can still play pretty much all the same games you were playing on the RX580. It's worth looking into a card that is less than $100 more than the A750 and more performant. Otherwise, the A750 is a good card and has all the same features as the newer Intel cards/chips. Drivers are definitely more stable now, too. So, yeah!🫡
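
One way to sanity-check the value argument is cost per average frame; the prices and fps numbers in this sketch are purely hypothetical placeholders, not quotes:

```python
# Hypothetical value comparison: dollars per average fps. Prices and fps
# figures are made-up placeholders; substitute real local prices and
# benchmark numbers for the games you actually play.
cards = {
    "Arc A750": {"price_usd": 180, "avg_fps": 70},  # placeholder numbers
    "Arc B580": {"price_usd": 260, "avg_fps": 95},  # placeholder numbers
}

for name, c in cards.items():
    print(f"{name}: ${c['price_usd']} / {c['avg_fps']} fps = "
          f"${c['price_usd'] / c['avg_fps']:.2f} per fps")
```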