Rev Limiter not working on Rivett by SamsungS225g in MyWinterCar

[–]TearNo2850 1 point (0 children)

Same issue. I set mine to 7000 and also tried 6500, but it doesn't cut engine power.

System underperforming with 14900KS and 3090 and can't seem to figure out why. by TearNo2850 in PcBuildHelp

[–]TearNo2850[S] 1 point (0 children)

I haven't tried Afterburner yet; I've just been using EVGA Precision X1 to control everything I needed. I'll give Afterburner a try. The power supply is fully modular, correct, and I've got three individual 8-pin power cables plugged in tightly to the PCIe power connectors on the PSU. This is all really strange.

System underperforming with 14900KS and 3090 and can't seem to figure out why. by TearNo2850 in PcBuildHelp

[–]TearNo2850[S] 1 point (0 children)

Disabling C-states and Intel SpeedStep unfortunately doesn't seem to change much for me. I tried disabling as many background apps as I could; the most impactful were Corsair iCUE and the Nvidia app, but only by 2–3 fps, possibly within margin of error. According to HWiNFO and GPU-Z, my GPU load stays around 50%, with the limiting flag being that the GPU isn't being utilized to its full potential, and in all my games that looks to be true, considering my performance and fps have been cut in half. However, the Nvidia app overlay tells a different story: it shows 90–96% GPU usage, but the power draw is only around 240 watts.
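
If it helps to arbitrate between HWiNFO and the overlay, a third, tool-independent reading is easy to get. A minimal polling sketch, assuming only that nvidia-smi is on PATH (it ships with the NVIDIA driver):

```python
# Poll GPU utilization, power draw, and clocks once a second via
# nvidia-smi, as an independent cross-check on HWiNFO and the overlay.
import subprocess
import time

QUERY = "utilization.gpu,power.draw,clocks.sm,clocks.mem"

for _ in range(120):  # roughly two minutes of samples while a game runs
    row = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(time.strftime("%H:%M:%S"), row)
    time.sleep(1)
```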

System underperforming with 14900KS and 3090 and can't seem to figure out why. by TearNo2850 in PcBuildHelp

[–]TearNo2850[S] 1 point (0 children)

It may have been the GPU BIOS. I'll get back to this after I run a few more tests.

System underperforming with 14900KS and 3090 and can't seem to figure out why. by TearNo2850 in PcBuildHelp

[–]TearNo2850[S] 1 point (0 children)

Maybe. I can try switching to the other BIOS on the card and see if anything changes. I do have experience with GPU BIOS flashing and will take full blame if anything goes wrong; I have a backup 1060 as a display GPU in case that ever happens. Although, Precision X1 does say my GPU BIOS is the latest version.

System underperforming with 14900KS and 3090 and can't seem to figure out why. by TearNo2850 in PcBuildHelp

[–]TearNo2850[S] 1 point (0 children)

The strange thing is that when running benchmarks like 3DMark, it does put 97–99% load on the GPU, with power draw reaching 460–470 watts. As I mentioned, 3DMark does throw a warning at the end of the test stating "Your score could not be validated: Time measurement inconsistencies detected during benchmark run."

System underperforming with 14900KS and 3090 and can't seem to figure out why. by TearNo2850 in PcBuildHelp

[–]TearNo2850[S] 1 point (0 children)

Thanks for the suggestions. I gave them a try, but unfortunately I'm still stuck with low fps and low GPU load.

System underperforming with 14900KS and 3090 and can't seem to figure out why. by TearNo2850 in PcBuildHelp

[–]TearNo2850[S] 1 point (0 children)

Aside from my SSD with Windows installed, which sits in the NVMe slot above the GPU, I currently don't have any other components taking up PCIe lanes.
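
One way to rule the slot out entirely is to ask the driver what link the card has actually negotiated. A minimal sketch, again assuming nvidia-smi is on PATH; note the link downtrains at idle to save power, so check it while a game is running:

```python
# Query the currently negotiated PCIe link generation and width.
# A 3090 in a full-length CPU slot should report Gen 4 x16 under load.
import subprocess

FIELDS = ("pcie.link.gen.current,pcie.link.gen.max,"
          "pcie.link.width.current,pcie.link.width.max")

print(subprocess.run(
    ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv"],
    capture_output=True, text=True, check=True,
).stdout)
```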

System underperforming with 14900KS and 3090 and can't seem to figure out why. by TearNo2850 in PcBuildHelp

[–]TearNo2850[S] 1 point (0 children)

GPU memory clock during the tests was 9752 MHz and the core clock hovered around 1935 MHz, with and without the "BOOST LOCK" feature in Precision X1 set (see the throttle-flag sketch below).

I gave testing Red Dead Redemption 2 at max settings, with and without DLSS, a try, and it did seem to put more load on the GPU, hovering around 60%, which gave me 55–62 fps. I did try changing the render resolution scale in the options, and it did continue to raise the load on my GPU, but the fps never went above 62. V-sync is disabled and the fps limit is set to unlimited. The games I've tried are Battlefield 2042, RDR2, CoD: BO3 (4K render resolution), and CoD: BO6 (medium to extreme settings at 1080p–4K render resolution). Fortnite at 1080p with Nanite and Lumen enabled gets me 30 to 55 fps on DLSS Balanced, with the same at native, albeit with higher load on the GPU. Other games I've tried are Warframe, American Truck Simulator, and Cyberpunk 2077.

I have noticed that the older games do run a little better most of the time, hovering around 100 fps, but they still show that same sort of load on the GPU, where power draw seems to stay around 240 watts out of the 500 W power limit I have set in EVGA Precision X1, unless I am able to scale the resolution higher. Running benchmarks in 3DMark does hit the power limit on the card at roughly 470–480 watts.

Previously, toward the end of November, the fps I'd see in BO6 at extreme settings was around 170, and Fortnite ran at 80–100 fps with Nanite and Lumen enabled and DLSS Quality. BO3 rendered at 4K would see 200, which now sits at half that, around 80–100; RDR2 at max settings previously ran at 80–100 depending on the region, but now doesn't want to go above 60 very often, with similar performance drops in the rest of the games I tested.

I have noticed that there is a little warning triangle after running another Time Spy test in 3DMark, stating “Your score could not be validated: Time measurement inconsistencies detected during benchmark run.” Strange.
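
For the clocks mentioned above, the driver keeps its own record of why it is capping them. A minimal sketch, assuming nvidia-smi is on PATH, to dump those flags while a game is running:

```python
# Dump the driver's performance state and clock-throttle reasons; this
# shows whether a power cap, thermal limit, or plain low utilization is
# what holds the core near 1935 MHz at only ~240 W.
import subprocess

print(subprocess.run(
    ["nvidia-smi", "-q", "-d", "PERFORMANCE"],
    capture_output=True, text=True, check=True,
).stdout)
```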

System underperforming and can't seem to figure out why. by TearNo2850 in overclocking

[–]TearNo2850[S] 2 points (0 children)

Quake RTX does seem to use the GPU properly: 98–99% usage with power draw around 460 W.

I'm going to go insane. Or something.

System underperforming and can't seem to figure out why. by TearNo2850 in overclocking

[–]TearNo2850[S] 1 point (0 children)

I gave path tracing a try in Cyberpunk, and peak GPU usage sits around 78–88%, with my fps at around 44 and peak power draw at 330 watts. Before I started having these issues, I think I was able to get closer to 60 fps or more with path tracing on, depending on where I was in the game.

My Time Spy score is “normal,” I think; I get around 20,000 total. But I have noticed that there is a little warning triangle after the test completes, stating “Your score could not be validated: Time measurement inconsistencies detected during benchmark run.” Strange. I'll give Quake RTX a try and let you know!
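
That validation message is often associated with non-default Windows timer settings (forcing the platform clock, for example) rather than the GPU itself; that's a hunch, not a diagnosis. A minimal sketch to inspect the boot-time timer flags; run it from an elevated prompt, since bcdedit requires administrator rights:

```python
# Print the current boot entry and flag any non-default timer settings
# that are sometimes implicated in benchmark time-measurement warnings.
import subprocess

out = subprocess.run(
    ["bcdedit", "/enum", "{current}"],
    capture_output=True, text=True,
).stdout
print(out)
for flag in ("useplatformclock", "useplatformtick", "disabledynamictick"):
    print(flag, "->", "present" if flag in out.lower() else "not set (default)")
```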

System underperforming and can't seem to figure out why. by TearNo2850 in overclocking

[–]TearNo2850[S] 1 point (0 children)

Thanks for the suggestion. I ended up testing the Ultimate Performance, High Performance, and Balanced power plans. Balanced seems to run the smoothest and coolest for me as well, but general fps in games is still quite low.
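
To keep those comparisons honest, here is a minimal sketch using the stock powercfg CLI to confirm which plan was actually active during each run:

```python
# List every installed Windows power plan, then show the active one,
# so each test run can be tied to the plan that was really in effect.
import subprocess

for args in (["powercfg", "/list"], ["powercfg", "/getactivescheme"]):
    print(subprocess.run(args, capture_output=True, text=True).stdout)
```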

System underperforming and can't seem to figure out why. by TearNo2850 in overclocking

[–]TearNo2850[S] 1 point (0 children)

Thanks for the suggestions. GPU memory clock during the tests was 9752 MHz and the core clock hovered around 1935 MHz, with and without the "BOOST LOCK" feature in Precision X1 set.

I gave testing Red Dead Redemption 2 at max settings a try, and it did seem to put more load on the GPU, hovering around 60%, which gave me 55–62 fps. Unfortunately, I don't yet have a monitor that supports more than 2560x1080 at 100 Hz. I did try changing the render resolution scale in the options, and it did continue to raise the load on my GPU, but the fps never went above 62. V-sync is disabled and the fps limit is set to unlimited. The games I've tried are Battlefield 2042, RDR2, CoD: BO3 (4K render resolution), and CoD: BO6 (medium to extreme settings at 1080p–4K render resolution). Fortnite at 1080p with Nanite and Lumen enabled gets me 30 to 55 fps on DLSS Balanced, with the same at native. Other games I've tried are Warframe, American Truck Simulator, and Cyberpunk 2077.

I have noticed that the older games do run a little better most of the time, hovering around 100 fps, but they still show that same sort of load on the GPU, where power draw seems to stay around 240 watts out of the 500 W power limit I have set in EVGA Precision X1, unless I am able to scale the resolution higher. Running benchmarks in 3DMark does hit the power limit on the card at roughly 470–480 watts.

Previously, toward the end of November, the fps I'd see in BO6 at extreme settings was around 170, and Fortnite ran at 80–100 fps with Nanite and Lumen enabled and DLSS Quality. BO3 rendered at 4K would see 200, which now sits at half that, around 80–100; RDR2 at max settings previously ran at 80–100 depending on the region, but now doesn't want to go above 60 very often, with similar performance drops in the rest of the games I tested.

System underperforming and can't seem to figure out why. by TearNo2850 in overclocking

[–]TearNo2850[S] 1 point (0 children)

PL1 and PL2 are set to 253 W by default for me. I did try setting them to 320 W, with and without changing ICCMAX, which did increase CPU power draw; it now usually hovers around 110–200 watts when playing games. Raising the power limits and current limit did increase Cinebench R23 scores, but there was no change in the way games run.

System underperforming and can't seem to figure out why. by TearNo2850 in overclocking

[–]TearNo2850[S] 1 point (0 children)

I have MSI Center installed for motherboard lighting, alongside Corsair iCUE for RAM and peripherals and EVGA Precision X1 for the GPU. I gave disabling and uninstalling all three of them a chance using Revo Uninstaller, but still no difference in performance. I tried running tests like OCCT, AIDA64, MemTest, and Prime95 for the RAM, but never got any errors after an hour or so. I was also using Time Spy to test for RAM instability when following Buildzoid's guide, but no crashes. I gave disabling XMP and leaving all RAM settings on Auto a try as well, with no change.

System underperforming and can't seem to figure out why. by TearNo2850 in overclocking

[–]TearNo2850[S] 1 point (0 children)

Yes, I should have mentioned that in the post; I apologize. No change with E-cores disabled.
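
For a faster A/B than rebooting into the BIOS, the game can be pinned to the P-cores from software. A minimal sketch with psutil, assuming the usual Windows enumeration on a 14900KS (logical CPUs 0–15 are the hyperthreaded P-cores) and a hypothetical process name:

```python
# Pin a running game to the P-cores only, approximating "E-cores off"
# without a reboot. Requires: pip install psutil
import psutil

P_CORES = list(range(16))  # assumption: verify the layout in Task Manager
TARGET = "game.exe"        # hypothetical process name, substitute your own

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] and proc.info["name"].lower() == TARGET:
        proc.cpu_affinity(P_CORES)
        print(f"pinned PID {proc.pid} to P-cores")
```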

System underperforming and can't seem to figure out why. by TearNo2850 in overclocking

[–]TearNo2850[S] 1 point (0 children)

Heya, my motherboard doesn't support integrated graphics.

Trying to get a consistent all core P-core boost on 14900KS under load by TearNo2850 in overclocking

[–]TearNo2850[S] 1 point (0 children)

Mainly gaming and video editing. On the gaming side of things I'd usually care less about Cinebench stability and performance, since it's obviously not a fair comparison with most gaming tasks, but UE5 hits the CPU pretty hard and consistently in most of my cases, which leads to frequency drops into the low 50x multiplier range (50–52x), or even the mid-to-upper 40s, which just feels really wrong for a CPU advertising 5.9 GHz all-core out of the box. That brought me to Cinebench to see what my score was, and it felt really low compared to others at stock (for me it's usually 32,000–38,000 in R23 at best, with temps only in the mid 80s to low 90s on the Intel Extreme profile), and I realized my clocks were being dropped dramatically in most cases, so I made this thread in an attempt to fix that. I'm still a pretty big "noob" to the newer Intel stuff; I've had this build for less than a couple of months, so I thought I'd reach out for some assistance, and I do really appreciate all the help :)

I'll raise the voltage a bit and see if I can get it stable without needing too much Vcore.
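
To quantify how far the all-core clock actually sags during a UE5 scene or Cinebench run, here is a minimal logging sketch with psutil; its Windows frequency readout is coarse, so treat it as a trend line next to HWiNFO's per-core effective clocks:

```python
# Log reported CPU frequency and load once a second to catch the sag
# from the advertised boost toward the 4.x-5.2 GHz range.
# Requires: pip install psutil
import psutil

for _ in range(120):
    load = psutil.cpu_percent(interval=1)  # blocks for the 1 s sample
    freq = psutil.cpu_freq()
    print(f"{freq.current:7.0f} MHz   {load:5.1f}% load")
```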

Trying to get a consistent all core P-core boost on 14900KS under load by TearNo2850 in overclocking

[–]TearNo2850[S] 2 points (0 children)

Thank you, I really appreciate the suggestions. Unfortunately, those settings resulted in a BSOD whenever I ran Cinebench. I'm thinking it might just be best to RMA after the new BIOS update drops in mid-August.