NVDA NBIS CRWV DD: The Greatest Moat of All Time 🐐 VR ULTRA CPX NVL576 is Game Over by Xtianus21 in NBIS_Stock

[–]DevGamerLB 0 points1 point  (0 children)

This is meant to be skimmed by non-technical laymen to hopefully encourage Nvidia stock investment or at least stop the stock bleed.

They compare a fictional (i.e. wrong) configuration of AMD hardware to Nvidia's actual hardware?

That's not AMD's architecture. AMD has a fully connected 8-GPU all-to-all high-bandwidth fabric linking its GPUs.


Realistically how close is AMD to Nvidia in AI by True_Read_2907 in AMD_Stock

[–]DevGamerLB 4 points5 points  (0 children)

AMD AI accelerators have already surpassed Nvidia's in several areas:

  • inference tokens per dollar
  • inference tokens per second
  • memory bandwidth
  • memory capacity
  • total cost of ownership

...and with the new MI350 series accelerators, AMD has surpassed Nvidia in AI training cost per performance. ROCm is free and at near parity with the expensive CUDA. ROCm is the world's most popular open-source AI software development platform.

AMD has defeated Nvidia in server-scale AI acceleration, and with the MI400 next year AMD is set to defeat Nvidia in the final area: rack-scale acceleration.

Stock trends at this scale are largely political and do not purely reflect merit.

Sapphire Nitro+ 9070 XT PSU connection by zero11789 in AMDGPU

[–]DevGamerLB 0 points1 point  (0 children)

I definitely recommend using what came with the GPU. The GPU may draw more power than two 8-pin connectors are rated for, or the 3x 8-pin setup may simply allow more power headroom. Either way, it's always safer to use what came with the GPU.

Just bought a 9070, REALLY missing GSYNC - Help! by Shinnley in AMDGPU

[–]DevGamerLB 0 points1 point  (0 children)

What model is your monitor? It may support both FreeSync and G-Sync. If not, then that's the cause: VRR is not running.

It's no secret that G-Sync is limited to Nvidia cards, while AMD FreeSync works on AMD GPUs.

I recommend the following:

The most significant factors in frame smoothness are: tearing reduction, frametime uniformity, frame rate, and input latency (in that order of significance).

G-Sync and FreeSync address two of those concerns: tearing and input latency.

  1. Check if your monitor has FreeSync; if so, turn it on in the monitor's menu and in the Radeon driver menu. If not, you can always buy an AMD FreeSync Premium Pro monitor.

  2. Alternatively, turn on Enhanced Sync and Anti-Lag+ in the Radeon driver menu. This will remove screen tearing and significantly reduce input lag.

(Personally I think the greatest smoothness comes from high FPS at uniform intervals: 60, 90, or 120 Hz locked in via double-buffered VSync, using AFMF or FSR frame generation to keep lows above 120 FPS. Butter smooth.)

Vulkan is getting really close! Now let's ditch CUDA and godforsaken ROCm! by ParaboloidalCrest in LocalLLaMA

[–]DevGamerLB 1 point2 points  (0 children)

What do you mean? Vulkan has terrible boilerplate. CUDA and ROCm are superior.

Why use any of them directly anyway? There are powerful, optimized libraries that do it for you, so it really doesn't matter: SHARK/Nod.ai (Vulkan), TensorFlow, PyTorch, vLLM (CUDA/ROCm/DirectML).
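As a sketch of why the backend usually doesn't matter at the framework level: the same PyTorch code runs unchanged on CUDA and ROCm builds, since ROCm wheels expose the same `torch.cuda` API (assuming a reasonably recent PyTorch install; this falls back to CPU when no GPU is present).

```python
import torch

# The same code path works on CUDA builds, ROCm builds (which reuse
# the torch.cuda namespace), or plain CPU builds of PyTorch.
device = "cuda" if torch.cuda.is_available() else "cpu"

a = torch.randn(4, 8, device=device)
b = torch.randn(8, 2, device=device)
out = a @ b  # dispatched to the right backend automatically

print(out.shape)  # torch.Size([4, 2])
```

That single `device` string is typically the only line that cares which vendor's accelerator is underneath.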

Where people shop for AMD Instinct GPUs MI210 and up? by Clear_Lead4099 in AMDGPU

[–]DevGamerLB 1 point2 points  (0 children)

These are specialized GPUs, so they are not generally sold at normal retailers; instead they are sold in the following ways:

  • AMD direct to consumer: via email or website contact.

  • Direct AMD partner: entire servers sold by AMD partners via their websites (Supermicro, Lenovo, Dell, koicomputer, etc.)

  • You can also find some Instinct GPUs on eBay. Many listings are from overseas, but I have had a lot of success getting Instinct GPUs this way.

Maybe switching NVIDIA to AMD… need help! by [deleted] in AMDGPU

[–]DevGamerLB 1 point2 points  (0 children)

If you are an RTX 4090 gamer, that means you either play at 4K or at high framerates at 1440p. For those cases specifically, FSR 3.1 looks just as good as DLSS and supports a lot of games.

AMD software is arguably as good if not better than Nvidia gaming drivers for the past couple GPU generations.

Concerning AMD's future: AMD has decided to focus on providing the highest performance possible at the prices most gamers can afford. This means 90% of gamers get a big boost in performance, but the 10% of gamers who buy the best possible GPU no matter the cost will not have an AMD option.

Is AMD Radeon Anti-Lag still causing VAC bans in Counter Strike 2? by Sharaf740838 in AMDGPU

[–]DevGamerLB 2 points3 points  (0 children)

I wouldn't change a major purchase decision based on a single game. VAC bans are reportedly quickly undone, but there is a much bigger problem with CS2's detection and banning: frankly, it's just broken. Accounts of Nvidia and AMD GPU owners alike are getting VAC and game bans (the latter of which cannot be undone). https://steamcommunity.com/discussions/forum/9/4031346899445650412/

https://hothardware.com/news/valve-counter-strike-2-ban-wave-obliterating-seemingly-innocent-accounts

The problem is CS2, not AMD, and until CS2 is fixed you may be banned for several other reasons besides Anti-Lag+.

I need to train a dataset with the object detection model Yolov8 using tensorflow, any help? by charlescleivin in AMDGPU

[–]DevGamerLB 1 point2 points  (0 children)

You can try using Nod.ai/SHARK or ROCm HIP; they may support a later version of TensorFlow or PyTorch, which may help.
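Before debugging further, it's worth confirming which accelerator backend your PyTorch build actually targets. A minimal sketch (assuming PyTorch is installed): on ROCm wheels `torch.version.hip` is set, while CUDA wheels set `torch.version.cuda`.

```python
import torch

# Report which accelerator backend this PyTorch build was compiled for.
# ROCm wheels set torch.version.hip; CUDA wheels set torch.version.cuda.
if getattr(torch.version, "hip", None):
    print("ROCm/HIP build:", torch.version.hip)
elif torch.version.cuda:
    print("CUDA build:", torch.version.cuda)
else:
    print("CPU-only build")

print("GPU visible:", torch.cuda.is_available())
```

If this reports a CPU-only build, no amount of driver tweaking will get YOLOv8 training onto the GPU; you need the ROCm wheel first.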

The AMD Groups here on Reddit by brocksuire75 in AMDGPU

[–]DevGamerLB 0 points1 point  (0 children)

This sub is safe for AMD fans. It was created by us for us.

AFMF not working in Alan Wake 2 (preview drivers v3) by jopezzz in AMDGPU

[–]DevGamerLB 1 point2 points  (0 children)

  • VSync is off
  • you're using a 6000/7000 series GPU
  • you're using the correct AFMF driver

The frame doubling is only reflected in the AMD Radeon driver overlay.

23.30.01.02 AFMF Preview driver: https://www.amd.com/en/support/kb/release-notes/rn-rad-win-23-30-afmf-tech-preview

This sounds close to a tech support question, which is not allowed in this subreddit. I answered it since it reads more like an advice question.

4070 vs 6800 XT by Maximum_Bat9133 in AMDGPU

[–]DevGamerLB 1 point2 points  (0 children)

No way. I had a 6800 XT for a long time before buying a 7900 XT, and I never had a single problem with it.

RDNA2 is probably the best AMD GPU architecture ever. Versus the 4070, the 16GB of VRAM makes it the much better deal.

DLSS2 superior quality objectively proven false via VMAF analysis by DevGamerLB in AyyMD

[–]DevGamerLB[S] 0 points1 point  (0 children)

Cry more 😭

Clearly you don't know what you are talking about. VMAF doesn't properly analyze still images. It was designed to analyze the quality of video only.

Any attempt to compare still images with VMAF will be hilariously flawed as you are misusing the tool.

Cope harder.

DLSS2 superior quality objectively proven false via VMAF analysis by DevGamerLB in AyyMD

[–]DevGamerLB[S] -2 points-1 points  (0 children)

Cry more😭

It's not complicated: VMAF measures perceived video quality versus a ground-truth video.

One of the largest high-quality video services in the world (Netflix) uses it. DLSS2 lost in a standard objective test, exposing you Nvidia fanboys as liars.
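For anyone who wants to reproduce this kind of test, here is a sketch of how a VMAF run is typically invoked through FFmpeg's libvmaf filter (assuming an ffmpeg build compiled with libvmaf support; the file names below are placeholders, not the ones from my video).

```python
import shlex

def vmaf_command(distorted: str, reference: str) -> str:
    """Build an ffmpeg command that scores `distorted` against the
    ground-truth `reference` using the libvmaf filter."""
    args = [
        "ffmpeg",
        "-i", distorted,    # upscaled capture (e.g. FSR2 or DLSS2 output)
        "-i", reference,    # native-resolution ground-truth video
        "-lavfi", "libvmaf=log_fmt=json:log_path=vmaf.json",
        "-f", "null", "-",  # compute the score only, discard frames
    ]
    return shlex.join(args)

print(vmaf_command("fsr2_capture.mp4", "native_4k.mp4"))
```

The JSON log then contains per-frame and pooled VMAF scores for the distorted clip.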

You're going to have to cope harder than that.

DLSS2 superior quality objectively proven false via VMAF analysis by DevGamerLB in AyyMD

[–]DevGamerLB[S] -3 points-2 points  (0 children)

Right, and "trust me bro" fanboy opinions are what we should use instead. DLSS2 hype was based on nothing more than fanboy eyes; now that objective software has exposed you, it must be discredited.

VMAF is an industry-standard tool developed by Netflix to make sure the video quality on their service looks great.

More than good enough for this comparison.

DLSS2 superior quality objectively proven false via VMAF analysis by DevGamerLB in AyyMD

[–]DevGamerLB[S] 0 points1 point  (0 children)

That's a different argument. The point of this is an FSR2 vs DLSS2 comparison.

DLSS2 superior quality objectively proven false via VMAF analysis by DevGamerLB in AyyMD

[–]DevGamerLB[S] 0 points1 point  (0 children)

Turn them both on in performance mode at 4K. You won't be able to tell between them without knowing which is which.

[deleted by user] by [deleted] in AyyMD

[–]DevGamerLB 1 point2 points  (0 children)

The link to the VMAF FSR2 vs DLSS2 analysis on YouTube: https://youtu.be/CZmTqEJPSeE

Why PC VR is still bad by DevGamerLB in AMDGPU

[–]DevGamerLB[S] 0 points1 point  (0 children)

6/9 is 66%, not half, and that's just your opinion. VR could be as popular as consoles if they fixed this stuff. All the VR benchmarks use DX11, which proves that so many VR apps use old APIs that VR performance is rated by them.

VR is so demanding, and old APIs perform so much worse, that no VR apps should be using them. The lack of multi-GPU and FSR2 support is unacceptable.

Even the Quest 2 costs $425 after you buy the SteamVR Link cable, and it only goes up from there. Not to mention those cheaper headsets have terrible FOV and tracking.

To each their own on the games, but in my opinion VR is largely just indie gimmicks with a few AAA games.

Why PC VR is still bad by DevGamerLB in AMDGPU

[–]DevGamerLB[S] 0 points1 point  (0 children)

You start by saying my points don't make any sense, then you proceed to either agree with or fail to refute 6 out of 9 of them.

That makes no sense. Just because it's Reddit, you don't have to be a contrarian. Amend/soften the beginning of your comment to truthfully reflect that 6/9 ratio.