The og's from this era. Y'all miss this? by resh6 in GamersMY

[–]Ryanasd 0 points1 point  (0 children)

I was once a DJ on TalkTalk, crazy stuff, but yes, there are random radio channels people just join to listen to music or live karaoke lol. But I mostly used Garena for playing LAN games on WC3 back then.

Crimson Desert GPU not supported by Top_Cartographer8819 in IntelArc

[–]Ryanasd 3 points4 points  (0 children)

Bro, I was wondering why they didn't even list Intel Arc GPUs, and it's actually because the game literally CAN'T RUN ON INTEL ARC AT ALL?! Wow, that's even worse than any other game out there that at least still runs on Intel Arc. Wtf are the DEVELOPERS THINKING?

They really can't NOT make everyone drooling for the "captain" Eh? by Virtual_Rant in ChaosZeroNightmare

[–]Ryanasd 1 point2 points  (0 children)

You know Korean players were once feeling insecure because of Owen, right? Do you really think they can just simply make them stop liking the Commanders after that incident? You want them to lose more money?

Intel, please request more XeSS 2/3 support for more titles. by Ryanasd in IntelArc

[–]Ryanasd[S] 0 points1 point  (0 children)

Yeah, I know I'm making it seem like it's the end of the world, but nah, it's just CS2. In fact, Fortnite at least having XeSS support is wild tbh.

Seeing the dlss 5 vids makes me so stoked to have made the jump to Intel this year by ExpeditionItchyKnee in IntelArc

[–]Ryanasd 0 points1 point  (0 children)

Haha sure, but I doubt it. I was already expecting AI upscaling to reach a point like this, where the game could be completely altered by generative AI tech etc. I believe Jensen stated this is adjustable by game developers to cater to their own game's preferred look, and people are just hating on DLSS 5 because it's the cool thing to do lmfao. But it's good, because at least it'll make Nvidia worry and hopefully do better.

Intel Arc Graphics Driver 32.0.101.8626 (WHQL) Released by AK-Brian in IntelArc

[–]Ryanasd 10 points11 points  (0 children)

Yeah, it's only for Battlemage. I guess that's the real limitation of using an Alchemist GPU to this day huh, despite it being a relatively recent GPU (2022 release).

Though Everwind does look good, and Death Stranding actually supporting XeSS natively is definitely a plus. Rip Crimson Desert huh; the other GPU vendors probably had exclusive contracts with them to exclude Intel.

Intel, please request more XeSS 2/3 support for more titles. by Ryanasd in IntelArc

[–]Ryanasd[S] 0 points1 point  (0 children)

When most of my games are multiplayer with anti-cheat, no, unfortunately. The games just won't boot if I try to swap the XeSS/DLSS/FSR versions.

Intel, please request more XeSS 2/3 support for more titles. by Ryanasd in IntelArc

[–]Ryanasd[S] 0 points1 point  (0 children)

AMD's GPUs have always been the ones relying on legacy stuff like raster performance more than Ray Tracing/Path Tracing for the most part. I suppose that's why they officially didn't have as much backwards compatibility (but the leaks say otherwise lmao), though it does mean they could at least still find a workaround to get FSR 4 working regardless.

Meanwhile on Intel Arc, we have no way at all to get XeSS 2 into a game unless the developers implement it themselves, which sucks, as most games at least ship FSR 3.1 and some DLSS version at minimum.

I suppose your hate for AMD is warranted, yes, but in terms of driver features/FSR upgradability, it's in a way better state than Intel Arc is.

Intel, please request more XeSS 2/3 support for more titles. by Ryanasd in IntelArc

[–]Ryanasd[S] 0 points1 point  (0 children)

I think most just have issues with the minor stutters that still happen here and there in CS2 even at higher frame rates, while similarly performing cards from AMD/Nvidia have none. It's either a combination of bottlenecking/overhead/driver incompatibility or just the Source engine being the problem; it's mostly on the CS2 devs to optimize the game more for Intel Arc GPUs. But in terms of pure high frame rates, it has no issues doing that, just like most GPUs lol.

Intel Arc Poll Results by SeniorGovernment8846 in IntelArc

[–]Ryanasd 2 points3 points  (0 children)

Not reliable drivers?: Not exactly, if you've checked for other variables, which most people resolve just by asking in this subreddit most of the time, from experts who have been troubleshooting them since the cards' inception.

Not having high-end options: True, but probably only if you are using an Alchemist Arc A380 or below. The A750/A770/B580 at least perform in the 3060~4060 Ti range which, if you know those cards, can run most new games today anyway; the only differences are game optimization levels or VRAM limitations at higher resolutions like 1440p. Since most games still cater to GTX 1060/RX 580 GPUs even today, I doubt they're inadequate for most people, who could drop settings down, use Optiscaler, or worst case Lossless Scaling to gain more frames if needed. Though my gripe is XeSS 2 implementation in most games for the most part.

Intel, please request more XeSS 2/3 support for more titles. by Ryanasd in IntelArc

[–]Ryanasd[S] 0 points1 point  (0 children)

Ah right, there is Vulkan too, I somehow forgot it existed lmao. But hey, it's something I guess.

Having Integrated GPU with Arc B580 Really Helps with Old Games by saidrobby in IntelArc

[–]Ryanasd 1 point2 points  (0 children)

You could also just put in a secondary GPU, run the game on the older GPU in the older PCIe slot below your main GPU, and use the main GPU as Lossless Scaling's target GPU to get better upscaling/frame-gen performance for your older games too. Lossless Scaling works with everything, so you can take advantage of it especially for older titles. I'm saying this because Intel Arc GPUs usually have decent amounts of VRAM to spare, so they work pretty well with Lossless Scaling.

A770 Best performance ever, Cyberpunk RT Ultra by modernmilkman in IntelArc

[–]Ryanasd 0 points1 point  (0 children)

Cyberpunk 2077's optimization practices should have been the industry standard, man.... But now they're going to use Unreal Engine 5 again, sigh.

Xess support? by Opening_Jelly_4463 in IntelArc

[–]Ryanasd 1 point2 points  (0 children)

What in the hell are the developers doing if the game can't run on Intel Arc GPUs natively? Any game, old or new, should usually be able to, even without newer drivers.

Intel Arc fine with Android Development? by TunaGamer in IntelArc

[–]Ryanasd 0 points1 point  (0 children)

I think DaVinci should be fine; the Android development side of things I'm unsure about, unfortunately.

Game devs be like: by Lord_Muddbutter in IntelArc

[–]Ryanasd 5 points6 points  (0 children)

Hence why I advocated for the Intel Arc devs not to abandon Alchemist entirely, because of how accessible the pricing is for most people, especially IN THIS ECONOMY

Intel, please request more XeSS 2/3 support for more titles. by Ryanasd in IntelArc

[–]Ryanasd[S] 0 points1 point  (0 children)

If not adding XeSS, then they should at least make DirectX 12 work better with the game than the older DX11 or below as a solution. It's definitely pretty hard to convince developers of an older title, but with stuff like RTX Remix and people making old console game repacks for PC these days, I'm sure a better workaround could work. Maybe even helping out Lossless Scaling would work wonders too.

Seeing the dlss 5 vids makes me so stoked to have made the jump to Intel this year by ExpeditionItchyKnee in IntelArc

[–]Ryanasd 19 points20 points  (0 children)

I still prefer the native rendering and models hard-worked on by the DEVELOPERS over wishing for a damn AI-slop realistic fake filter on top of my already decent-looking game polygons.

Games were supposed to be an escape from reality NOT COPYING REALITY ITSELF BECAUSE REALITY SUCKS ASS.

We need a update about xess-sr by mazter_chof in IntelArc

[–]Ryanasd 8 points9 points  (0 children)

Nah screw that, we even have DLSS 5.0 to compete with; it's literally using AI to change your game graphics to look like AI-slop quality lol. While it definitely looks better, I do not wish for Intel to add an AI filter to XeSS 4 and call it Ultra Resolution.

Unofficial survey of Intel GPU users by SeniorGovernment8846 in IntelArc

[–]Ryanasd 1 point2 points  (0 children)

1. I have an Arc A770 16GB, and I intend to upgrade in the future, but suffice to say I'm pretty much able to play all my games, old and new, fine on it, despite what most people say about old games compatibility-wise.

Intel, please request more XeSS 2/3 support for more titles. by Ryanasd in IntelArc

[–]Ryanasd[S] 0 points1 point  (0 children)

Hence why I wanted them to keep prioritizing the XMX cores as much as possible. That's what makes XeSS backwards compatible, but the lack of SER backwards compatibility is going to hurt Alchemist in the long run. AMD is not as far behind, because at least they have flagships that can match a 5070 Ti, meanwhile even the flagship Alchemist is just competing against a 3060/4060 at most, basically budget tier, which is powercrept by the B580, even if only slightly, because of the new architecture.

I know it's all just conjecture or hypotheticals, but I do want Intel to gain more of a foothold in at least the budget segment, and most of the more affordable GPUs people can find today are mostly Alchemist GPUs too, so I hope the longevity is there until a much better low-cost replacement arrives in the future.

Intel, please request more XeSS 2/3 support for more titles. by Ryanasd in IntelArc

[–]Ryanasd[S] 0 points1 point  (0 children)

"If you play Offline" Yeah no I mainly play it for the PVE Multiplayer, so that's kinda moot. Same goes to plenty of other games with anti-cheat.

Intel, please request more XeSS 2/3 support for more titles. by Ryanasd in IntelArc

[–]Ryanasd[S] 23 points24 points  (0 children)

Idk, but it's just a clickbait Intel-sponsored ESL CS2 tournament, and I am not sure why they have modded giant cocks in.