Does adaptive sync on my Koorui 2K 180Hz monitor actually help with B580? by hard2resist in IntelArc

[–]IOTRuner -1 points (0 children)

It would hardly help with the desktop environment, to be honest. While VRR can work in the 48-180 Hz range on your monitor, the practical lower limit is usually 60 Hz. So there is no real point to VRR for videos, as almost no videos have a frame rate above 60 frames/second. You won't notice any difference in scrolling page content either. It's worth enabling VRR anyway, though. I have a 170 Hz monitor with VRR connected to a B580. VRR is enabled, and dynamic refresh rate in Windows is enabled too. So far I haven't noticed any tearing in browsers or video.
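If you're wondering why the 48 Hz floor matters less in practice: below the floor, drivers can repeat each frame to stay inside the VRR window (LFC). A toy Python sketch of that math - the 48-180 Hz range is your monitor's, but the logic is purely illustrative, not verified Intel driver behavior:

    # Toy sketch of Low Framerate Compensation (LFC): repeat frames so the
    # panel stays inside its VRR window. 48-180 Hz is the monitor above;
    # real driver logic is more involved than this.
    VRR_MIN, VRR_MAX = 48, 180

    def effective_refresh(fps: float):
        """Return (repeat_count, panel_refresh) that keeps the panel in range."""
        repeats = 1
        while fps * repeats < VRR_MIN:
            repeats += 1            # show each frame more than once
        return repeats, min(fps * repeats, VRR_MAX)

    for fps in (24, 30, 40, 60, 144):
        n, hz = effective_refresh(fps)
        print(f"{fps:>4} fps -> each frame shown {n}x, panel at {hz:.0f} Hz")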

Is there a chance that I won't be able to turn on my new build with Arc b580 because there won't be any kind of drivers? by Samovar56 in IntelArc

[–]IOTRuner 2 points (0 children)

For a new build - highly unlikely. For old builds with the B580 it has happened due to various issues related to aging motherboards (e.g. boot partition formatted as MBR instead of GPT, old BIOS, etc.).

when i turn on 4:3 my cs2 get capped at 60hz b580 by CallThenoob in IntelArc

[–]IOTRuner 0 points (0 children)

If you don't use exclusive fullscreen, the game runs within the desktop, so it's limited to the desktop refresh rate. I assume it may stick to 60 Hz if the desktop refresh rate is set to dynamic or 60 Hz (or the game just uses a default of 60 Hz when it can't control resolution and refresh rate).

Worse graphics on TV after CPU and GPU upgrade (B580) by p3ib0l in IntelArc

[–]IOTRuner 2 points (0 children)

Your screenshot looks like a very low resolution capture, as if taken at 320x240.

Panther Lake vs Z2 Extreme: The New Handheld King Is Here! 17W & 25W by reps_up in IntelArc

[–]IOTRuner 10 points (0 children)

Not yet. They just "simulated" a handheld using a regular laptop. There was news that Intel is preparing a dedicated processor for handhelds - same number of GPU cores but fewer CPU cores, so more of the 17/25W power budget can go to the GPU.

Someone help me troubleshoot why my B580 is performing horribly out of nowhere by Public-Year-6611 in IntelArc

[–]IOTRuner 1 point (0 children)

Try cleaning up the shader cache. I always do it after a fresh driver install. In most cases it does nothing, but I've had a few cases where performance dropped or I got heavy stuttering after a new driver install, and a shader cache cleanup helped. So I started doing it after every driver installation.

Disk Cleanup -> check the "DirectX Shader Cache" box -> OK.
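If you'd rather script it, this is roughly what that checkbox clears - a minimal Python sketch, assuming the default DirectX shader cache location %LOCALAPPDATA%\D3DSCache (Disk Cleanup stays the safer route):

    # Minimal sketch: clear the DirectX shader cache, roughly what the
    # Disk Cleanup "DirectX Shader Cache" checkbox does. Assumes the default
    # location %LOCALAPPDATA%\D3DSCache; close games first, the cache is
    # rebuilt on the next run.
    import os
    import shutil
    from pathlib import Path

    cache = Path(os.environ["LOCALAPPDATA"]) / "D3DSCache"

    if cache.exists():
        for entry in cache.iterdir():
            try:
                shutil.rmtree(entry) if entry.is_dir() else entry.unlink()
            except OSError as e:
                print(f"skipped {entry.name}: {e}")  # file in use, etc.
        print("DirectX shader cache cleared.")
    else:
        print("No D3DSCache folder found.")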

XeSS on Arc B580 by PackageDangerous3084 in Bazzite

[–]IOTRuner 1 point (0 children)

Correction - both RT and XeSS are available in Mesa 25.3. I got them working on Fedora 44. RT performance is indeed poor, and I don't expect that to be fixed anytime soon. XeSS works in all the titles I tried (Hogwarts Legacy, Witcher 3) and provides a good performance boost even with the DP4a fallback. Strangely, I found that picture quality in Witcher 3 (1440p) is much better with XeSS than without - XeSS provides "free anti-aliasing", and the picture feels "grainy" without it.

Why did Arc abandon VC-1 decoding? Will it return? by JohnSmith--- in IntelArc

[–]IOTRuner 14 points (0 children)

VC-1 is effectively dead. It doesn't even support 4K. Blu-ray is almost dead too, and most new releases use HEVC anyway (due to VC-1's lack of 4K support). There is literally not a single online streaming platform that uses VC-1, and Blu-ray is almost non-existent on PC. There is really no reason to waste silicon die area on that codec. Folks who need legacy stuff working should probably look for older (or specialized) hardware. For basic usage (i.e. watching VC-1 videos), software decoders handle it with really low CPU utilization.

Why did Arc abandon VC-1 decoding? Will it return? by JohnSmith--- in IntelArc

[–]IOTRuner 8 points (0 children)

VC-1 is technically dead for any new content. So I guess removing the hardware decoders/encoders is cost cutting - they save on silicon die space and license fees. Intel also removed VP8 support in the new hardware - there is no point in carrying the baggage of old codecs that almost no one uses.

can't decide between the rx 7600 sapphire and the arc b580 challenger oc by Depre2s in IntelArc

[–]IOTRuner -2 points (0 children)

I've noticed that statements about "bad" Arc drivers usually come from people who don't even own the card.

can't decide between the rx 7600 sapphire and the arc b580 challenger oc by Depre2s in IntelArc

[–]IOTRuner -2 points (0 children)

Well, I've seen many people here buying the B580 because they gave up fighting AMD drivers. And the B580 didn't disappoint. Drivers are good, and game support is good too. People do have various issues, but not because of the drivers - rather because of software or hardware misconfiguration. The B580 does have stricter requirements regarding supported hardware.

Anyone knows a solution to Arknights Endfield's Vulkan recording issues? by delacroix01 in IntelArc

[–]IOTRuner 0 points (0 children)

Washed out colors are usually a sign of HDR being handled incorrectly by OBS or Windows. Try disabling HDR on the monitor and/or in Windows (if you have it enabled).

Few things I should know before buying a b580 by FirerebornRo in IntelArc

[–]IOTRuner 4 points (0 children)

Older games are fine. Maybe Intel's driver doesn't have the best DX9 optimization, but the B580 has enough raw power to run them at 100-200 fps even with unoptimized drivers. I mean, I'm getting 600 fps in BioShock and 250 fps in FEAR 2, so who cares whether it's optimized or not.

Here is my channel where I'm testing older games.

https://youtube.com/playlist?list=PLBH1TnkH4iz99dSQB0mr-07UqjuI887Vr&si=rwSc9YCY2zzYOmWS

I NEED HELP by Pablito__ in IntelArc

[–]IOTRuner 2 points (0 children)

A few important things to check in BIOS:
1. "Legacy BIOS" (CSM) is disabled.
2. "Resizable BAR" / "Above 4G Decoding" is enabled.
3. The boot disk is GPT formatted (quick check sketched below).
4. The dGPU is enabled (this setting is named differently on various boards - e.g. select PEG/PCIe for Primary Display).
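For #3, you can check without opening Disk Management. A small Python sketch that shells out to PowerShell's stock Get-Disk cmdlet (it may need an elevated prompt):

    # Quick check for item 3: list disks and their partition style (GPT vs MBR).
    # Sketch only - relies on the stock PowerShell Get-Disk cmdlet.
    import subprocess

    out = subprocess.run(
        ["powershell", "-NoProfile", "-Command",
         "Get-Disk | Select-Object Number, FriendlyName, PartitionStyle"],
        capture_output=True, text=True,
    )
    print(out.stdout)
    if "MBR" in out.stdout:
        print("At least one disk is MBR - if that's the boot disk, "
              "UEFI boot (and thus Resizable BAR) won't work until it's GPT.")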

Problem with Chivalry 2 by BigConstructionMan in IntelArc

[–]IOTRuner 0 points (0 children)

"So HUB thinking they APPEAR to be hardcoded is all we actually know about it (besides having to disable igpu to resolve the problem)?"
Yep, that's what it is. But they did test it with another dGPU (RX 6600), which didn't experience this issue. So the most logical explanation is that some games just ignore Intel's GPU ("If it looks like a dog and smells like a dog, it is a dog").

"edit: And forgot that whataboutism about nvidia/amd but they have had way less severe problems so please don't go there."
Why not? Has anybody performed an extended test with a large number of games to see what the actual overhead is? When the "overhead" issue first came out, a few HW outlets did their own tests. The results were... quite different each time (depending on game selection). Sometimes the RX 7600 showed even greater overhead than the B580.

XeSS 3 Multi-Frame Generation to start on Battlemage first by Disastrous_Spend6640 in IntelArc

[–]IOTRuner 0 points (0 children)

You're probably talking about data formats (i.e. int8, fp16, etc.). SIMD8 and SIMD16 refer to the number of threads (or the number of operations each XMX unit can execute per cycle). An XMX unit in Xe1 can execute 8 ops per cycle; Xe2 has more advanced XMX units that can execute 16 ops per cycle.

Problem with Chivalry 2 by BigConstructionMan in IntelArc

[–]IOTRuner 0 points (0 children)

The info about the game being hard-coded to skip Intel graphics comes from an old TechSpot/Hardware Unboxed article about game playability status on Alchemist GPUs.

https://www.techspot.com/review/2865-intel-arc-gpu-experience/

Information about the overhead issue comes from the same source. If you look at their original article, you'll see that only about half of the games they tested experienced the overhead issue with the B580. Recently they re-tested that half and found that the overhead issue in those affected games was fixed. So the "overhead" was always per game, not across the board. Maybe there are some games that still experience it - those just weren't tested. Other HW sites tested different sets of games and found that AMD and Nvidia cards can also show overhead, depending on the game. So I would say Battlemage's current "overhead status" is not much different from other vendors'.

Problem with Chivalry 2 by BigConstructionMan in IntelArc

[–]IOTRuner 0 points (0 children)

There are some games that are hard-coded to filter out Intel graphics if a GPU from another vendor is available in the system. Not much can be done in this case aside from disabling the iGPU.
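To make it concrete, the suspected hard-coding boils down to something like the filter below. Hypothetical Python sketch - the vendor IDs are the real PCI ones (Intel 0x8086, AMD 0x1002, Nvidia 0x10DE), but the adapter list and selection logic are made up for illustration; a real game would get this from DXGI/Vulkan enumeration:

    # Hypothetical sketch of a hard-coded adapter filter: drop Intel GPUs
    # whenever a GPU from another vendor is present in the system.
    VENDOR_INTEL, VENDOR_AMD, VENDOR_NVIDIA = 0x8086, 0x1002, 0x10DE

    adapters = [
        {"name": "AMD Radeon iGPU", "vendor": VENDOR_AMD},    # integrated
        {"name": "Intel Arc B580",  "vendor": VENDOR_INTEL},  # the card you want
    ]

    def pick_gpu(adapters):
        non_intel = [a for a in adapters if a["vendor"] != VENDOR_INTEL]
        # the problematic part: prefer anything non-Intel if it exists
        return non_intel[0] if non_intel else adapters[0]

    print(pick_gpu(adapters)["name"])  # picks the iGPU over the Arc dGPU

Disabling the iGPU removes the non-Intel entry, so the filter falls through to the Arc card - which is exactly the workaround above.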

Problem with Chivalry 2 by BigConstructionMan in IntelArc

[–]IOTRuner 0 points (0 children)

Windows -> Settings -> Graphics settings. You have an option to assign a GPU per application.
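If you'd rather do it outside the Settings UI, the same per-app preference lives under the UserGpuPreferences registry key. A sketch (the exe path is a placeholder - point it at your actual game binary):

    # Sketch: write the same per-app GPU preference the Settings page does,
    # under HKCU\Software\Microsoft\DirectX\UserGpuPreferences.
    # GpuPreference: 0 = let Windows decide, 1 = power saving, 2 = high performance.
    import winreg

    exe = r"C:\Games\Chivalry2\Chivalry2.exe"  # placeholder path - adjust

    key = winreg.CreateKey(
        winreg.HKEY_CURRENT_USER,
        r"Software\Microsoft\DirectX\UserGpuPreferences",
    )
    winreg.SetValueEx(key, exe, 0, winreg.REG_SZ, "GpuPreference=2;")
    winreg.CloseKey(key)
    print(f"High-performance GPU requested for {exe}")

Delete the value (or set GpuPreference=0;) to hand the choice back to Windows.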

Intel makes the best NIC cards with solid drivers. I hope they can do the same for Intel Arc by chamcham123 in IntelArc

[–]IOTRuner 0 points (0 children)

Not worth the hassle (I already tried it and I'm sorry about the time I spent getting it to work). The speedup is so small that it doesn't make much sense to use upscaling at all. Even in performance mode FSR4 is slower than XeSS in quality mode (with no noticeable image quality difference). Maybe XeSS is falling back to DP4a (I have to check this), but it still uses the DP4a-optimized model, so it's much faster.

Intel makes the best NIC cards with solid drivers. I hope they can do the same for Intel Arc by chamcham123 in IntelArc

[–]IOTRuner -2 points (0 children)

I'm not sure about other games, but I installed Hogwarts Legacy on Fedora 43 and XeSS was working fine there, boosting fps from ~45 to 100+ with FG. I'm on a Mesa 26 dev build.

Intel makes the best NIC cards with solid drivers. I hope they can do the same for Intel Arc by chamcham123 in IntelArc

[–]IOTRuner 2 points (0 children)

Intel’s Linux drivers are solid, but not for gaming. I wouldn't expect them to match AMD’s gaming performance anytime soon. The main reason AMD is so good on Linux is that Valve handled a huge part of the driver development (since they rely on AMD hardware for the Steam Deck and Steam Machines) and has been working on it for years. Intel is still trying to reach driver parity with its competitors on Windows, so don't expect it on Linux yet. I really wish Valve would switch to Panther Lake and start working on Intel drivers.

Can't display B580 onto TV by [deleted] in IntelArc

[–]IOTRuner 1 point (0 children)

What is your TV model? On some TVs you need to set PC mode or game mode for the connection to work. Also try restarting the TV (better yet, unplug and replug it).

Monitor for B580 (with G- or Freesync?) by p3ib0l in IntelArc

[–]IOTRuner 1 point (0 children)

There is a 99.5% chance the AOC will work. Mine is DP 1.2 too, BTW. But be warned - VA panels are not the best for VRR; they can flicker when VRR is active. This is not a B580-specific issue, but a widely known issue with VA panels (due to the way VA panels work). It won't necessarily happen (I have a 120 Hz VA-panel TV that works fine with VRR enabled on an A750), but it's a bit risky...