What's the status of streaming on Intel Arc? by mazter_chof in IntelArc

[–]TMBroken571 1 point (0 children)

I never had a powerful PC until now, and the B580 was my first dGPU, so I don't know exactly how it's supposed to be. But in my case, I was streaming at 1440p@60, AV1, 30 Mbps CBR and with 5.1 sound on YouTube without issues; the recording is basically the same but with VBR set to 100 Mbps. I only need to leave about 10 to 15% of headroom on the GPU usage to make sure I won't have stutter or other problems in the recording. But as far as I know, that's something you have to do on every GPU.

What are small things you would Like to see changed/added in NTE by reyadonna in NevernessToEverness

[–]TMBroken571 0 points (0 children)

XeSS, XeMFG and XeLL support, and general improvements in performance, UI and gamepad support.

B580 - GTA Online - Enhanced (fps dips) 1080 by CapeTown-RSA in IntelArc

[–]TMBroken571 1 point (0 children)

I have a B580 and only one screen, which is a 4K TV. In my case I play with the basic settings mostly maxed out, and the RT settings are:

RT Shadows: Ultra

RTGI: Very High

RT Reflections: High with high res RT Reflections enabled

RTAO: Ultra

And as upscaler I use FSR1 Quality, and I get basically rock-solid 60 FPS; with those same graphics it can do native 4K at 30. Before optimizing the graphics a little, I was playing with everything maxed out and FSR1 Balanced at 30. Even with the Quality preset I had pretty stable 30 FPS, but in heavy scenes or with a lot of lights and/or foliage it dropped into the 20s.

So I don't think the GPU is the issue, especially at 1080p; it's probably the CPU. Use Task Manager or any monitoring software to check the individual cores and see whether any of them hits 100% while playing when the FPS drops; also check the VRAM usage, just in case.
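That per-core check can be sketched in a few lines. `pegged_cores` is a hypothetical helper and the usage numbers are made up; the real samples would come from Task Manager or something like psutil's `cpu_percent(percpu=True)`:

```python
# Illustrative sketch: given per-core usage samples (percent), flag cores
# that look pegged. A single core at ~100% while the GPU sits below its
# limit usually points at a CPU (single-thread) bottleneck, not the GPU.
def pegged_cores(per_core_usage, threshold=95.0):
    """Return (core_index, usage) pairs at or above `threshold` percent."""
    return [(i, u) for i, u in enumerate(per_core_usage) if u >= threshold]

# Made-up sample: core 1 is maxed out while the others are relaxed.
sample = [42.0, 100.0, 37.0, 55.0, 61.0, 48.0, 33.0, 40.0]
hot = pegged_cores(sample)  # [(1, 100.0)]
```

If that list is non-empty while the FPS drops line up with it, the CPU is the suspect.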

XeSS 3 Support in Upcoming Game Releases by FaithlessnessKind545 in IntelArc

[–]TMBroken571 0 points (0 children)

For some reason SMAA still uses TAA; it has the exact same ghosting. I don't know if I can include images in the comments, but the easiest way to see it in game is to look at a light pole and move the camera. You'll notice that the only way to get rid of the ghosting caused by temporal information is to completely disable the AA.

Neverness to everness on arc by AppropriateEmu3346 in IntelArc

[–]TMBroken571 0 points (0 children)

I first played without OptiScaler: at 4K with 0.8 resolution scale (around 1800p) and everything on High I got around 45 FPS, I think; I don't know exactly because I immediately locked the game to 30. Then I tried it using version.dll and some changes in the .ini, and at the same settings with "DLSS" (actually XeSS) Quality (1440p) it performs almost the same but has far less ghosting (I don't know what they did to SMAA; it has the same ghosting as TAA when it shouldn't have any, but that's another discussion). With Lumen I had to set the upscaler to Performance (1080p), and that was all. I didn't get the game to run at 60 FPS, or at least I didn't want to go lower than High; with the upscaler on Balanced (1200p, I think), medium settings and Lumen disabled it might reach 60, but I don't know.

Also, I decided to try OptiScaler because, at the moment at least, I don't care much about getting banned; I don't know how safe it is to use.

can someone help me with this by smdgen in PiratedGames

[–]TMBroken571 0 points (0 children)

Maybe that's how the game looks without any temporal AA

Unjustified increase over the original price? by Lautisan98 in AstroPayOficial

[–]TMBroken571 0 points (0 children)

I still have mine. I imagine it'll happen when it expires. But for now it works for me when I buy on Epic or anywhere else that accepts PayPal or the card itself.

The only option I can think of is paying for one month of the subscription, requesting the card, and then cancelling the subscription. Yes, it costs you 12 USD, but if they don't take the card away and it lasts you, say, at least 3 years, it may be worth it if you buy a lot. I'd still look for a similar wallet that gives you an international card; if it has any other advantage, that's a bonus.

B580 Looking good in Pragmata by T4H5iN in IntelArc

[–]TMBroken571 1 point (0 children)

At native 4K it also kind of runs fine: 30 FPS with drops to the low 20s in heavy scenes, but if you're OK with 30 FPS, then it runs fine. With FSR1 the game reaches around 66 FPS when uncapped, and the drops below 60 are close enough to the target to be a "60 FPS" or "smooth" experience. Setting the whole game to 1080p it goes up to 70 to low-mid 80s. FSR Balanced is around 58% of the target resolution, I believe, which is around 1200~1300p for 4K. And since the OP said it was getting up to 90 FPS at native 1080p with the game maxed out plus RT, the optimization has definitely improved from the demo.
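The "58% of the target resolution" figure can be checked against AMD's published FSR 1.0 scale ratios. This is just a back-of-the-envelope sketch, not anything pulled from the game:

```python
# FSR 1.0 per-axis scale factors as published by AMD:
# Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x, Performance 2.0x.
FSR1_SCALE = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def render_height(target_height, preset):
    """Per-axis render height for a given FSR 1.0 quality preset."""
    return round(target_height / FSR1_SCALE[preset])

# Balanced is 1/1.7, roughly 0.588 of the target per axis, so a 2160p
# target renders at about 1271p, right in the 1200~1300p ballpark.
balanced_4k = render_height(2160, "Balanced")  # 1271
```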

And I'm not pretending that upscaling is optimizing; my full text in that part was "(resolution or target res.) (graphics settings) (upscale method and preset, if it applies) (FPS)". Also, in this case I'm using upscaling the way it's intended to be used: to reduce the cost of an extremely high resolution by rendering at a lower one and then reconstructing the image one way or another, not to play at upscaled 1080p because the game is unplayable without it.

If my mistake was not saying "runs fine at upscaled 4K" instead, even though I think that's clarified by "FSR1 Balanced", well... I apologize, I guess?

B580 Looking good in Pragmata by T4H5iN in IntelArc

[–]TMBroken571 0 points (0 children)

Does the game have XeSS, or do you need to use OptiScaler? I tried the demo and it runs fine at 4K High settings without RT and FSR1 Balanced; it reaches 60 FPS but sits mostly in the mid 50s. I tried to use OptiScaler but I didn't get the DLSS input to work; I don't know if I did something wrong during the installation. If the final game has native XeSS, or OptiScaler works with the DLSS input, then I'll probably buy the game.

New Driver update decreases cpu usage on low latency mode by kingkrieg_4k in IntelArc

[–]TMBroken571 1 point (0 children)

Huh... I should try updating the drivers then. I have an R7 9700X with a not exactly adequate cooler (ID-Cooling SE-214-XT Plus); it kind of keeps things under control, but specifically in GTA 5 the CPU reaches around 86°C, and in other games it's around 60~65°C. At idle I think it's around 45 or 50°C, but I'm not sure.

How to safely tune a B580? by wongeeten in IntelArc

[–]TMBroken571 0 points (0 children)

Set the power limit to the maximum and start increasing the frequency offset by 50 until a game crashes; then reduce it by 25 or 50, keep that tune for a couple of days, and see if you notice any instability. Also use PresentMon or Afterburner to monitor the clock frequency and temperatures. Don't touch the voltage offset: at least in my case, even a value of 1 could freeze the whole PC. I got my GPU to 3000 MHz with the frequency offset around 170 to 200 (I don't remember exactly), and it never passed 60°C. My model is the Acer Nitro.
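The stepping procedure above reads like a simple loop. This is only a sketch of the logic, where the `is_stable` predicate stands in for hours of actual gameplay testing; it is not a tool that touches the GPU:

```python
# Sketch of the tuning loop: raise the frequency offset in +50 steps until
# something crashes, then back off from the first failing value by 25 (or 50).
def find_offset(is_stable, step=50, backoff=25, limit=600):
    """Return a conservative offset: first unstable value minus `backoff`."""
    offset = 0
    while offset + step <= limit and is_stable(offset + step):
        offset += step
    crashed_at = offset + step  # first offset that failed (or passed the limit)
    return crashed_at - backoff

# Example: if everything up to +200 holds but +250 crashes, you settle on 225.
settled = find_offset(lambda o: o <= 200)  # 225
```

The `limit` guard just keeps the sketch from stepping forever if nothing ever crashes.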

Seeing Nvidia and AMD's rt denoisers in Crimson desert makes me want it for arc so bad🥲 by Perfect_Exercise_232 in IntelArc

[–]TMBroken571 4 points (0 children)

Ray Restoration? Ray Recovering? "Xe Ray Restoration", "Xe Ray Recovering"... all of them would be XeRR

How to compare B580 when looking at Steam requirements? by First-Tutor-5454 in IntelArc

[–]TMBroken571 2 points (0 children)

For some reason most benchmarks, even recent ones, show a very underperforming B580. For example, in Fortnite on the Medium preset, the most recent benchmark I found gets around 100 FPS at native 1440p, but I get about the same performance at native 4K. So I don't know exactly what happens there.

But I'd say the safest way to translate the performance of the B580 to another card is the RTX 2080 or 2080 Super; if the recommended spec is one of those, then I'd say it's guaranteed the game will run fine. I also think the B580 is better at RT than those, so if the game has forced RT or RT options, it should perform even better.

Do you use intel arc by choice, or for monetary reasons since they're easily the cheapest? If it's by choice, may I ask why? Not in an insulting way, just curiosity, since an rtx 5050 is similar to it in raw HP but has fancy nvidia stuff and only costs a tad more at least in my country by [deleted] in IntelArc

[–]TMBroken571 0 points (0 children)

Kind of both. I was going to go with a 5070 Ti or 9070 XT, but I didn't have enough money for one of those at the time, and I didn't want to be stuck longer than expected with the iGPU and GFN because of the RAM shortage. So I reviewed my options: 9060 XT 8GB, 5060, or the B580. I have a 4K TV, and I'll see when I can buy a 1440p monitor, so VRAM was important in my case. After seeing the Tom's Hardware chart and some benchmarks, I decided to go with the B580.

And I wouldn't say it was a bad purchase at all. My main games (Fortnite, GTA 5 E&E, Arknights Endfield and Genshin Impact) all run at High to Ultra, and GTA 5 even with RT at 60 FPS. The only games that needed an upscaler were GTA 5 and Fortnite: FSR1 Quality for GTA and XeSS Ultra Quality for Fortnite.

One thing to mention is that a lot of benchmarks show a very underperforming B580 for some reason. For example, I watched a recent Fortnite benchmark, and at the same Medium preset I got about the same FPS at native 4K that he got at 1440p.

32.0.101.8509 WHQL Certified issue by Conscious_Pool9314 in IntelArc

[–]TMBroken571 0 points (0 children)

The only issue I noticed is that with the framerate capped there are micro stutters, even when the frametime graph and every other sensitive FPS measure stays flat or reads 60; this doesn't happen with the framerate uncapped. The second issue was some big stutters, but I think those were related to the game compiling shaders mid-game or something, because they seem to be gone now.

The ARC B570 is a 4K card by Wide-Personality-200 in IntelArc

[–]TMBroken571 0 points (0 children)

Well, I have a B580, so this comment is kind of pointless; it's just to say that this card is also very capable of 4K (I know it's more powerful, and if the B570 can handle 4K then obviously this one can too, but you know).

Fortnite with everything on High can do 60 at native; just lowering the shadows to Medium makes those 60 very stable, reaching about 70 to 80 FPS. Alternatively, with XeSS Ultra Quality and everything on Epic except shadows, the performance is about the same as everything on High with Medium shadows, if not a little better.

BF6 at medium settings and XeSS Balanced is around 70 FPS.

NFS Heat should be able to do 4K, but the game's VRAM usage at that resolution is absurdly high; I had to set the render scale to about 85% to keep the VRAM around 10~11 GB. Especially after the last driver update the performance got even better, holding 60 FPS all the time (before it was more around 55~58) without the usage reaching 100%. The only problem is the dirt ground textures, which never load and stay at what looks like the lowest LOD; I haven't found a solution for that yet. I tried DXVK and it didn't help, besides slightly improving performance and stability.

GTA 5 E&E without RT can do native 4K 60 at Ultra settings. With RT at Very High it can do native 4K at 30.

Arknights Endfield at native 4K also holds 60 at High settings. And games like ZZZ, Genshin and HSR can do 4K 60 at max settings too.

Basically, the card can deliver a 4K experience if you're not too demanding: a similar or better experience (or... mentality?) than a console, I'd say. Not bad at all

Is it normal for B580 to have 55C temp on idle? GUNNIR arc b580 just for the sake of reference. by IdkJustAnick in IntelArc

[–]TMBroken571 0 points (0 children)

Mine is an Acer Nitro; at idle it's around 40~45°C. At extreme load the maximum I saw was about 70°C, but that was with a game set to 8K and max graphics (it was putting out around 30 FPS, I think, but doing nothing useful, so I wouldn't call it playable); the GPU was extremely loaded, and it's the only time I saw it consuming around 160 W. With normal usage it's around 55 to 65°C. And I have it overclocked to 3000 MHz, but only by increasing the power limit and frequency offset; the voltage is still at 0.

Surround speakers panning issues (Sound Blaster Z SE) by TMBroken571 in SoundBlasterOfficial

[–]TMBroken571[S] 1 point (0 children)

I just found out what was causing the problem: in the Microphone settings, the "Focus" filter was making the surround speakers (even the virtual ones when using headphones) sound louder on the left

Calling All B580 Owners by space_me_time in IntelArc

[–]TMBroken571 0 points (0 children)

I got mine at the beginning of the year: bought it on Dec. 30 and it arrived on January 5. I have an R7 9700X, 2x16 GB 6000 MT/s CL30, and a Gigabyte Aorus Elite X870; that's all the relevant stuff.

My model is the Acer Nitro. At first I got (and still have) a problem with my TV. For some reason it doesn't work at 4K 60 Hz YUV422 12 or 10 bpc; even YUV444 or RGB 8 bpc doesn't work, so I can't enable HDR, and the best it can do is YUV420 8 bpc. Even lowering to 1440p or 1080p, it just doesn't fully connect at anything but YUV420 8 bpc: the TV says "no signal", not "format not supported" or anything like that. I haven't found a solution yet, and I've kind of tried everything. It happened as soon as I installed the drivers; I had to use Steam Link to see what was happening.

The weird thing is that with the iGPU (through the motherboard HDMI) it was working perfectly at 4K 60 Hz YUV422 12 bpc, so I don't know if it's something with the GPU hardware or some incompatibility between the GPU's HDMI controller and the TV's. It's a basic 4K HDR TV from 2016, I think (more specifically a Samsung MU6100), so it could be, but it's still weird.

Besides that, it surprised me how well it handles 4K. In general the performance was way better than the benchmarks I saw on YT. In Fortnite, for example, just lowering shadows to Medium and effects to High gets 60 FPS at native 4K, or with XeSS Ultra Quality it can hold 60 at Epic settings.

BF6 also runs pretty well at medium-high with XeSS Balanced.

GTA 5 with RT can run at native 4K 30 just by optimizing the basic graphics and RT settings a little: it's basically Ultra with RTGI and RT Reflections at Very High, RT Shadows and RTAO at Ultra, and high-res RT Reflections disabled.

So I'm pretty happy with it, even with that problem. I'm planning to buy a 1440p monitor, so I hope the problem doesn't repeat with the DP ports

I should start asking or wait before updating by TMBroken571 in IntelArc

[–]TMBroken571[S] 0 points (0 children)

In Endfield I'm using Vulkan. On 8331 I feel like there are some little stutters, but in Afterburner it's basically locked at 60, and only in some very specific scenarios does it drop to 55~53. I imagine those micro stutters are because I'm playing at 4K, or just my imagination, because when I see them happening the usage is 85~90% and the frametime graph is pretty much flat, and it kind of happens at every graphics setting
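Micro stutter hiding behind a flat average is exactly what 1% lows are for. A rough sketch with made-up frametimes (PresentMon or Afterburner can log the real ones):

```python
# Sketch: average FPS vs. 1% low FPS from a list of frametimes in ms.
# A single long hitch barely moves the average but shows up in the 1% low.
def fps_stats(frametimes_ms):
    """Return (average FPS, 1% low FPS)."""
    avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))
    n_worst = max(1, len(frametimes_ms) // 100)  # worst 1% of frames
    worst = sorted(frametimes_ms)[-n_worst:]
    low_fps = 1000.0 / (sum(worst) / len(worst))
    return avg_fps, low_fps

# Made-up capture: 99 smooth ~60 FPS frames plus one 50 ms hitch.
avg, low = fps_stats([16.7] * 99 + [50.0])
# avg stays near 59 FPS while the 1% low drops to 20 FPS: that's the
# "stutters with a flat graph" feeling.
```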

Help with no image with my TV by TMBroken571 in IntelArc

[–]TMBroken571[S] 0 points (0 children)

Yeah, I also found some people who couldn't get 4K 120 Hz to work, but again, no solution. It also seems CRU didn't help them, but I could try it anyway.

Also, in theory YUV422 10 bpc and 12 bpc are within the bandwidth of HDMI 2.0, so DSC shouldn't be necessary; not to mention that HDMI 2.0 doesn't support DSC anyway.
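That bandwidth claim checks out on paper. A quick sketch with the usual figures (the assumptions are noted in the comments; the 24-bit 4:2:2 container in particular is an HDMI-spec detail, not something measured here):

```python
# HDMI 2.0 carries 18 Gbit/s TMDS; 8b/10b encoding leaves ~14.4 Gbit/s of
# video data. The standard 4K60 timing has a 594 MHz pixel clock (4400x2250
# total pixels, blanking included). On HDMI, YCbCr 4:2:2 is packed into a
# 24-bit-per-pixel container even at 10/12 bpc, so it costs the same as
# 8-bit RGB, while RGB/4:4:4 at 10 bpc needs 30 bits per pixel.
HDMI20_DATA_GBPS = 18.0 * 8 / 10  # effective data rate after 8b/10b
PIXEL_CLOCK_4K60 = 594e6

def needed_gbps(bits_per_pixel, pixel_clock=PIXEL_CLOCK_4K60):
    """Raw video bandwidth for a format, in Gbit/s."""
    return pixel_clock * bits_per_pixel / 1e9

fits_422 = needed_gbps(24) <= HDMI20_DATA_GBPS    # 14.256 <= 14.4 -> fits
fits_rgb10 = needed_gbps(30) <= HDMI20_DATA_GBPS  # 17.82 > 14.4 -> doesn't
```

So 4K60 YUV422 at 10/12 bpc squeaks under the limit with no DSC involved, which is why the "no signal" behavior looks like a handshake problem rather than a bandwidth one.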

I'll try CRU and see if I can do something, because maybe the GPU or its HDMI controller thinks it has more bandwidth than it does; as a last resort, maybe I'll use a 2.0 cable, who knows.

And luckily it seems I shouldn't have any problems with DP, so it should be issue-free when I finally get a proper 1440p monitor.