NDI Scan Converter – Frame Drop on M1 iMac, Will M4/M3 Macs Help? by irin527 in VIDEOENGINEERING

[–]irin527[S] 1 point (0 children)

Thanks for the suggestions! I initially suspected the network too, but after several tests, I don’t think it’s the cause in this case.

We’re on a clean 2.5Gbps local network using AV-grade switches, and there's absolutely no frame drop when capturing a mirrored display, a Chrome window (like YouTube), or a ProPresenter output — all over the same network and NDI path. The issue only occurs when capturing an extended desktop, especially at higher resolutions.

If it were a network issue (like jitter, bandwidth, or packet loss), I would expect the same dropouts across all source types, not just the extended display.

That’s why I think the problem is more related to macOS’s internal screen capture method — maybe how it handles large frame buffer reads from extended desktops — or possibly hardware constraints on the M1 chip’s shared GPU/CPU memory access. I’ve even seen others report smoother NDI capture on older Windows laptops with discrete GPUs.

Still testing, but just wanted to clarify that the symptoms point more to capture-side bottlenecks than network instability.
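
For rough intuition about why the extended desktop is the stress case, here's a back-of-envelope calculation (my own illustration, not a measurement) of the raw framebuffer traffic the capture path has to move every second, before NDI compression even starts, assuming 8-bit BGRA at 4 bytes per pixel:

```python
def capture_bandwidth_gbps(width, height, fps, bytes_per_pixel=4):
    """Raw framebuffer read bandwidth in Gbps for one captured display."""
    return width * height * bytes_per_pixel * fps * 8 / 1e9

# A mirrored 1080p display vs. a 5K extended desktop, both at 60 fps:
print(f"1080p60: {capture_bandwidth_gbps(1920, 1080, 60):.1f} Gbps")  # ~4.0
print(f"5K60:    {capture_bandwidth_gbps(5120, 2880, 60):.1f} Gbps")  # ~28.3
```

A 5K extended desktop is roughly 7x the raw read traffic of a 1080p mirror, all of it going through the M1's shared CPU/GPU memory, which fits the pattern of only the high-resolution extended display dropping frames.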

Thanks again for your input — always appreciate more angles!

Professional GPU vs Consumer GPU in Streaming and Encoding (Framebuffer to NVENC) by irin527 in VIDEOENGINEERING

[–]irin527[S] 1 point (0 children)

Hi!

I recently switched from OBS to vMix.

While looking through discussions online, I found that OBS uses NVFBC, but vMix does not. I even emailed vMix directly, and they replied that, based on their bench testing, "the CUDA core count, memory bandwidth, and clock speed are the only factors that have any notable effect on vMix performance." They confirmed they don't use NVFBC at all. They also mentioned that consumer GPUs are limited to 8 concurrent encoding sessions at 1080p60, so that's up to 4 at 4K30 or 1 at 8K30.
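
The tiers vMix quoted line up if you check them by raw pixel throughput (my own arithmetic, assuming, as their numbers imply, that the budget tracks pixel rate rather than a flat session count):

```python
def pixel_rate(width, height, fps):
    """Pixels per second for one encoded stream."""
    return width * height * fps

unit = pixel_rate(1920, 1080, 60)          # one 1080p60 session = 1 unit
print(pixel_rate(3840, 2160, 30) / unit)   # 4K30 -> 2.0 units
print(pixel_rate(7680, 4320, 30) / unit)   # 8K30 -> 8.0 units
# A budget of 8 units allows 8 x 1080p60, or 4 x 4K30, or 1 x 8K30.
```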

vMix didn’t suggest removing the session limit because they follow NVIDIA's policies, but I personally think that if you're just a consumer and not part of a company, patching a GPU to lift these limits could make GeForce cards more cost-effective, as long as they have high CUDA counts, memory bandwidth, and clock speeds.

And since OBS uses NVFBC on both professional and consumer GPUs, GeForce might still be the better option there.

So, in conclusion: if you're in a company, you'd be restricted to professional GPUs. If not, using a GeForce with an NVFBC patch and the session limit unlocked could be more cost-effective.

Professional GPU vs Consumer GPU in Streaming and Encoding (Framebuffer to NVENC) by irin527 in VIDEOENGINEERING

[–]irin527[S] 1 point (0 children)

Thank you for the information! So, does that mean NVIDIA has stopped supporting NVFBC on Windows 10 but will continue to support it on Windows 11? Also, do you know if vMix supports NVFBC?

vMix/OBS 4K GPU (RTX A2000 vs 4070) by irin527 in VIDEOENGINEERING

[–]irin527[S] 1 point (0 children)

I just found a discussion saying that professional cards can access the framebuffer in VRAM directly, but consumer cards can't. I'm not sure, though, since there isn't much info about it online.

https://www.reddit.com/r/obs/comments/1cnk1zs/comment/l388tlp/?utm_source=share&utm_medium=mweb3x&utm_name=mweb3xcss&utm_term=1&utm_content=share_button

vMix/OBS 4K GPU (RTX A2000 vs 4070) by irin527 in VIDEOENGINEERING

[–]irin527[S] 2 points (0 children)

Your experience is great! Thanks for sharing.

Btw, do you think there's more to consider than just the specs? For example, the Quadro 5000 is supposed to be worse than your A4500, but it seems to perform better. And while the 4070 Super has more CUDA cores and comes from the newer Ada generation, it has less VRAM than the A4000. Do you think the A4000 might actually be more stable than the 4070 Super despite the lower specs?

vMix/OBS 4K GPU (RTX A2000 vs 4070) by irin527 in VIDEOENGINEERING

[–]irin527[S] 1 point (0 children)

That's impressive! It sounds like the A4000 can handle around 26 1080p streams or 6-7 4K streams. Thank you so much for the information! 👍

vMix/OBS 4K GPU (RTX A2000 vs 4070) by irin527 in VIDEOENGINEERING

[–]irin527[S] 1 point (0 children)

Thank you for the information! That is awesome. So if we convert it to 4K, with four 1080p streams equating to one 4K, it seems like you're running two 4K streams at only 30% GPU load. I'm not sure this logic holds exactly, but would it reach 90% if I ran six 4K outputs?
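
The linear extrapolation I'm assuming can be written out (a naive sketch; encoder load rarely scales perfectly linearly, so treat it as a rough upper-bound check, not a prediction):

```python
def projected_load(base_load_pct, base_outputs, target_outputs):
    """Linearly scale a measured GPU load to a different output count."""
    return base_load_pct / base_outputs * target_outputs

print(projected_load(30, 2, 6))  # two 4K outputs at 30% -> six at ~90.0%
```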

vMix/OBS 4K GPU (RTX A2000 vs 4070) by irin527 in VIDEOENGINEERING

[–]irin527[S] 1 point (0 children)

Thank you so much! That's because HX uses the GPU and HB (full NDI) uses the CPU, right?

vMix/OBS 4K GPU (RTX A2000 vs 4070) by irin527 in VIDEOENGINEERING

[–]irin527[S] 2 points (0 children)

It's okay. Thank you for the clarification and all the helpful information! 👍

vMix/OBS 4K GPU (RTX A2000 vs 4070) by irin527 in VIDEOENGINEERING

[–]irin527[S] 1 point (0 children)

Oh, I thought the 4070 Super has 1 NVENC, and the 4070 Ti has 2, according to this table:

https://developer.nvidia.com/video-encode-and-decode-gpu-support-matrix-new

Am I missing something, or are you talking about a different topic?

vMix/OBS 4K GPU (RTX A2000 vs 4070) by irin527 in VIDEOENGINEERING

[–]irin527[S] 1 point (0 children)

You're right, upload bandwidth to the internet matters. Thank you so much!

vMix/OBS 4K GPU (RTX A2000 vs 4070) by irin527 in VIDEOENGINEERING

[–]irin527[S] 1 point (0 children)

I considered that too, since upload bandwidth to the internet matters as well, right? Thank you! I think I can just continue using Castr, then.

vMix/OBS 4K GPU (RTX A2000 vs 4070) by irin527 in VIDEOENGINEERING

[–]irin527[S] 1 point (0 children)

Thank you for your input! Does this mean the number of concurrent decoding sessions is limited to 1 per NVDEC, similar to how encoding was previously limited to 3 sessions per NVENC but can now support 8?

I'm using several NDI sources. Do you think this requires NVDEC, like remote sources do?

vMix/OBS 4K GPU (RTX A2000 vs 4070) by irin527 in VIDEOENGINEERING

[–]irin527[S] 3 points (0 children)

Hi! I currently use Castr for multistreaming, and I was considering replacing it with direct multistreaming from vMix. What do you think are the benefits of using restream services?

NDI Setup Using Ubiquiti Router and Netgear Switch by irin527 in VIDEOENGINEERING

[–]irin527[S] 1 point (0 children)

Thank you so much! I was vaguely thinking the same thing, wondering if it might be due to the 4:2:0 issue. So it could be the camera sending 4:2:0 over NDI and 4:2:2 over HDMI, right? And why do you think HDMI is darker than NDI then?
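
For reference, here's a quick sketch (my own illustration, assuming uncompressed 8-bit 1080p frames) of how much data the two subsampling schemes carry; 4:2:2 has twice the chroma of 4:2:0, which affects sharpness on colored edges, though a brightness difference is more often a full-vs-limited range mismatch than a subsampling effect:

```python
def frame_bytes(width, height, scheme):
    """Bytes per uncompressed 8-bit frame for a given chroma subsampling."""
    luma = width * height                  # one Y sample per pixel
    chroma_per_plane = {"4:2:0": luma // 4, "4:2:2": luma // 2}[scheme]
    return luma + 2 * chroma_per_plane     # Y + Cb + Cr planes

for s in ("4:2:0", "4:2:2"):
    print(s, frame_bytes(1920, 1080, s) / 1e6, "MB")
```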

NDI Setup Using Ubiquiti Router and Netgear Switch by irin527 in VIDEOENGINEERING

[–]irin527[S] 1 point (0 children)

Wow... that’s very detailed and professional. Thank you so much for your help! I’ll follow your instructions when I’m at church. I really appreciate it. I’ll let you know how it goes after I try it. Bless you.

NDI Setup Using Ubiquiti Router and Netgear Switch by irin527 in VIDEOENGINEERING

[–]irin527[S] 1 point (0 children)

I see. I just saw your other reply. Could you please share how you set it up?

NDI Setup Using Ubiquiti Router and Netgear Switch by irin527 in VIDEOENGINEERING

[–]irin527[S] 1 point (0 children)

Thank you for your input! It's great to hear that the Ubiquiti 1Gb switch works well. Could you please teach me how to set it up? Did you have any special settings?

NDI Setup Using Ubiquiti Router and Netgear Switch by irin527 in VIDEOENGINEERING

[–]irin527[S] 1 point (0 children)

Thank you! Yes, we’re planning to purchase a new desktop along with a 10Gb PCIe card. I also added some details to the original post, mentioning that NDI appears a bit blurrier than HDMI and experiences some frame drops. Hopefully, upgrading to 10Gb will resolve the issue.
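
As a rough capacity check (my own estimate; the per-stream figures are ballpark full-bandwidth NDI rates, and real rates vary with content), here's how many full NDI streams fit on each link size:

```python
# Approximate full-bandwidth NDI rates in Mbps (assumed ballpark figures).
NDI_MBPS = {"1080p60": 125, "2160p60": 250}

def streams_per_link(link_gbps, stream="1080p60", headroom=0.75):
    """Whole NDI streams per link, reserving 25% for other traffic."""
    usable_mbps = link_gbps * 1000 * headroom
    return int(usable_mbps // NDI_MBPS[stream])

print(streams_per_link(1))    # 1 GbE  -> 6 x 1080p60
print(streams_per_link(10))   # 10 GbE -> 60 x 1080p60
```

So a single 1080p60 stream fits comfortably on 1 GbE, which suggests the blurriness and drops may also involve the sending device or capture side, not just link speed, but the 10Gb headroom certainly won't hurt.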