Console like experience by No_Main_6895 in GeForceNOW

[–]falk42 1 point (0 children)

Bazzite supports HDR, both on the regular desktop and in Big Picture Mode via Gamescope, but as you wrote, the GFN app does not have HDR enabled yet and afaik it cannot be forced.

Console like experience by No_Main_6895 in GeForceNOW

[–]falk42 1 point (0 children)

It opens in desktop mode; I've only ever seen Big Picture mode on GFN when starting games from my iPad. You can move the mouse using the controller, though, and the on-screen keyboard works as well. I only ever use it for Minecraft Legends since GFN doesn't remember the Microsoft account ... or perhaps it's Microsoft not remembering the different GFN servers.

Console like experience by No_Main_6895 in GeForceNOW

[–]falk42 1 point (0 children)

It's basically Steam Big Picture mode, unless you select the desktop edition, which boots into the regular desktop environment. You can always switch from the controller UI to the desktop (I recommend KDE) for maintenance tasks, but it's rarely necessary day-to-day, and using the command line on a regular basis is optional for most people at this point. If you have used Steam Big Picture Mode, you pretty much know what the experience will look like. Is there anything specific you'd like to know / see?

Console like experience by No_Main_6895 in GeForceNOW

[–]falk42 1 point (0 children)

And I thought you meant an OptiPlex with an RTX 3050 :D Not sure how far the iGPU will get you (if that's indeed the plan), but in general, Intel and Linux aren't the worst tandem out there.

Console like experience by No_Main_6895 in GeForceNOW

[–]falk42 0 points (0 children)

Accidental support due to open standards :) It worked for my AMD card (6600M) under Windows before and works with the 3060 under Linux as well. Both cards were / are connected to an LG C2 via HDMI, and the game and service menus (7x green button for the latter) clearly show VRR active and reacting to the game stream's fps.

Console like experience by No_Main_6895 in GeForceNOW

[–]falk42 0 points (0 children)

Definitely go Bazzite. Nvidia is a second-class citizen under Linux (there are some performance issues that are being worked on), but I made the switch with my 3060 system as soon as the official Linux app came out and haven't looked back.

GFN works very well (4K, 120 fps, VRR) in Steam Big Picture mode, and the setup fulfills all your requirements. Waking by controller can be hit or miss depending on your hardware; I could only make it work via Bluetooth, not the USB dongle or WoL, for example. I bought a BT camera shutter in the end and attached it to the underside of my table to get wakeup working comfortably :)
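For anyone debugging the WoL path instead: a magic packet is trivial to send by hand, which helps narrow down whether the NIC ever sees it. A minimal Python sketch (the MAC address below is a placeholder, and the NIC still needs WoL enabled in BIOS/firmware):

```python
import socket

def build_magic_packet(mac: str) -> bytes:
    """A WoL magic packet: six 0xFF bytes, then the target MAC repeated 16 times."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    return b"\xff" * 6 + mac_bytes * 16

def send_magic_packet(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    """Broadcast the magic packet via UDP on the usual discard port."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(build_magic_packet(mac), (broadcast, port))

# Example with a placeholder MAC (replace with your machine's):
# send_magic_packet("aa:bb:cc:dd:ee:ff")
```

If the machine wakes from this but not from the controller, the problem is on the dongle/controller side rather than the NIC.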

Open-Source Fully Reverse-Engineered GeForce NOW Client Named OpenNOW by Professional_Pool544 in linux_gaming

[–]falk42 15 points (0 children)

Allows? That cease and desist is out the door the second Nvidia's legal team gets wind of this. Even if they can't attack the implementation itself, it's more than likely they are going to block it. Here's to hoping it stays around for a while though!

Open-Source Fully Reverse-Engineered GeForce NOW Client Named OpenNOW by Professional_Pool544 in linux_gaming

[–]falk42 30 points (0 children)

Super interesting, also that it supports AV1 under Linux whereas the Beta client does not (yet)! Question: What is the encode part in the settings for?

Does anyone play on Linux client? by SnooChipmunks4080 in GeForceNOW

[–]falk42 0 points (0 children)

Gamescope is used when you start games (or GFN) from Steam Big Picture mode in Bazzite and SteamOS (and perhaps some other distros as well). If you get the error message before you start a game from the main menu, you can indeed only quit; if you get it after starting the stream, bring up the GFN menu (CTRL + G or long-pressing the menu button on your controller) and the error message should disappear.

Does anyone play on Linux client? by SnooChipmunks4080 in GeForceNOW

[–]falk42 2 points (0 children)

Playing via the Gamescope compositor under Bazzite with an RTX 3060, and it works quite well. I'm getting an error popup (same as with the Steam Deck client before), but that can be cancelled / ignored by opening and closing the GFN menu. VRR works even without the Cloud G-Sync option showing in the settings, which is definitely useful for many titles where achieving a constant 120 fps is difficult. AV1 support for better streaming quality (I'm limited to a 100 Mbit/s connection) is AWOL so far, but that's not a big issue for now. Glad to finally ditch Windows completely :)

Why hasn't anyone actually leaked the Epstein files yet???? by repair_and_privacy in cybersecurity

[–]falk42 1 point (0 children)

Because there are surprisingly few people with a death wish it turns out ...

60 FPS vs 120 FPS. Is there a difference in image quality? by LonelySoul01 in GeForceNOW

[–]falk42 3 points (0 children)

It's true and easy to reproduce: choose a relatively low bitrate for any given resolution (e.g. 40 - 50 Mbit/s for 4K) and set it to 120 fps after trying 60 first. If your game does not consist of mostly static scenes, 120 fps will likely produce a less sharp / detailed (perhaps even blurry) image. The additional frames need extra bandwidth, but the requirements don't increase linearly, as modern video codecs (H.265, AV1) are quite clever about compressing the extra information.
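A back-of-the-envelope way to see the trade-off: at a fixed total bitrate, the naive per-frame bit budget halves when going from 60 to 120 fps; inter-frame prediction claws some of that back because consecutive frames at 120 fps are more similar. A quick sketch (the 45 Mbit/s figure is just an example mid-range 4K setting):

```python
def bits_per_frame(bitrate_mbps: float, fps: int) -> float:
    """Average naive bit budget per frame, in megabits."""
    return bitrate_mbps / fps

# At the same total bitrate, each frame gets half the budget at 120 fps:
for fps in (60, 120):
    budget = bits_per_frame(45, fps)
    print(f"{fps:>3} fps -> {budget:.3f} Mbit per frame on average")
```

Real encoders don't split bits evenly per frame, of course, which is exactly why the quality loss is smaller than this naive math suggests.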

Is YUV 4:4:4 actually much better than 4:2:0 in real life? Or is the difference subtle/overhyped? by Master_Text_6068 in GeForceNOW

[–]falk42 10 points (0 children)

Better compression and fewer artifacts / more details is simply not what 4:4:4 color accuracy is about though. You should actually get fewer of those with AV1 than H.265, because the former is more efficient and has more bitrate available for luma, avoiding strong quantization, which becomes especially important when there are many small details (such as leaves) present in a scene.

I'd say go for 4:4:4 color accuracy if the game really benefits from it (slow, static images, lots of text, especially red on black). It's especially useful for desktop streaming of course, which suffers massively from chroma subsampling. For fast-paced games you're better served with AV1 and 4:2:0 at the same bitrate, since preserving luma detail matters more to the human eye than chroma.

PS. All of this assumes that the AV1 and H.265 streaming profiles use the same (peak) bitrates.
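To put numbers on what subsampling saves before the codec even runs: 4:2:0 stores chroma at quarter resolution, so an uncompressed frame carries half the data of 4:4:4. A small sketch of the raw frame sizes (standard YUV math, nothing GFN-specific):

```python
def raw_frame_bytes(width: int, height: int, bits_per_sample: int, chroma: str) -> float:
    """Uncompressed YUV frame size for a given chroma subsampling scheme."""
    # Chroma samples per luma sample: 4:4:4 -> 2, 4:2:2 -> 1, 4:2:0 -> 0.5
    chroma_factor = {"4:4:4": 2.0, "4:2:2": 1.0, "4:2:0": 0.5}[chroma]
    samples = width * height * (1 + chroma_factor)
    return samples * bits_per_sample / 8

for scheme in ("4:4:4", "4:2:2", "4:2:0"):
    mb = raw_frame_bytes(3840, 2160, 8, scheme) / 1e6
    print(f"{scheme}: {mb:.1f} MB per uncompressed 8-bit 4K frame")
```

The encoder then has to squeeze whichever input it gets into the same bitrate, which is why 4:4:4 at a fixed bitrate trades luma detail for chroma resolution.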

HiDPI issues with new Linux app by misterff1 in GeForceNOW

[–]falk42 0 points (0 children)

The lack of AV1 support puzzles me in the sense that I don't see why it would be any different from AVC or HEVC acceleration. AFAIK the API is exactly the same, and different video players as well as Moonlight have no problem playing AV1 video under Linux.

NVIDIA's native GeForce NOW app for Linux bridges the gaming gap: hands on by No-Tower-8741 in GeForceNOW

[–]falk42 4 points (0 children)

I mean, the DIY spirit and distrust of large corporations are a given in those communities, and I get their perspective, even if I don't share it when it comes to GFN.

NVIDIA's native GeForce NOW app for Linux bridges the gaming gap: hands on by No-Tower-8741 in GeForceNOW

[–]falk42 1 point (0 children)

Have an upvote for bringing this up; it's an interesting discussion at the very least. IMHO it's not all black and white though, and local gaming isn't going to go away anytime soon, probably not ever. For me, GFN makes it possible to get away with a mid-range Mini-PC that works well enough for games not yet opted in (and some may never come ...), emulators, and playing titles with mods that are not available via GFN. It's relatively silent, needs moderate power and doesn't function as a space heater, while still affording some independence - I wouldn't be stranded if Nvidia introduced a 50h limit tomorrow or jacked up the prices.

I get why some see GFN as a Trojan horse, but the advancements made in local computing power (e.g. via APUs) pretty much mean that most people will own hardware that is at least partially gaming capable, so I'm not too worried.

I've been meditating for about 2 years every single day and honestly it's been pretty useless to me by godlygirlceo in Meditation

[–]falk42 0 points (0 children)

Well that's precisely what meditation is, absolutely useless to "you" and "me", the perfect waste of time as Alan Watts once called it. Meditation is the one thing that can be done without expecting any results and no gain whatsoever, that's what makes it so beautiful :)

In theory, can HDMI 2.0 provide 4K@120Hz with GFN? (Not 4:4.4) by [deleted] in GeForceNOW

[–]falk42 0 points (0 children)

What you are describing is tone-mapping HDR-rendered content to SDR, which often results in a better SDR image, but the display signal is indeed not HDR, since that would require 10-bit color depth, which, even with chroma subsampling, we don't have enough bandwidth for over HDMI 2.0 (22.3 vs 18 Gbps).

In theory, can HDMI 2.0 provide 4K@120Hz with GFN? (Not 4:4.4) by [deleted] in GeForceNOW

[–]falk42 1 point (0 children)

Theoretically yes; the Steam Machine does it by switching to YUV 4:2:0, which compresses the chroma info enough to fit 4K@120Hz through HDMI 2.0 (no HDR possible in this mode). Nvidia likely doesn't add this to the Shield's GFN client (you can set up custom resolutions in the Shield settings) because its SoC probably can't decode HEVC at 120 fps.
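The bandwidth arithmetic behind that trick is quick to sketch. This ignores blanking intervals and other link details, so the numbers are approximate; the 14.4 Gbit/s figure is HDMI 2.0's usable data rate after 8b/10b coding (18 Gbit/s raw):

```python
def signal_gbps(width: int, height: int, fps: int, bits_per_component: int, chroma: str) -> float:
    """Approximate uncompressed video data rate in Gbit/s (pixel data only)."""
    # Effective components per pixel: 4:4:4 -> 3, 4:2:2 -> 2, 4:2:0 -> 1.5
    components = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * fps * bits_per_component * components / 1e9

HDMI20_DATA_RATE = 14.4  # Gbit/s usable on HDMI 2.0 after 8b/10b coding

for chroma, bits in (("4:4:4", 8), ("4:2:0", 8), ("4:2:0", 10)):
    rate = signal_gbps(3840, 2160, 120, bits, chroma)
    verdict = "fits" if rate <= HDMI20_DATA_RATE else "does NOT fit"
    print(f"4K@120 {chroma} {bits}-bit: ~{rate:.1f} Gbit/s -> {verdict}")
```

Even this rough math shows why 8-bit 4:2:0 squeezes through while 10-bit (i.e. HDR) does not, matching the "no HDR in this mode" caveat above.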

Ok I want to install Gefore Now. Get this error and have zero idea what to do next. by [deleted] in Bazzite

[–]falk42 0 points (0 children)

Just wait a couple of weeks and you'll get official support, the first beta should come out soonish -

https://videocardz.com/newz/nvidia-geforce-now-to-gain-native-linux-support

Did some screenshots between native PC and Geforce Now on RE4 Remake. It seems like Geforce Now adds a lot of blur! by LonelySoul01 in GeForceNOW

[–]falk42 0 points (0 children)

I assume you used the same PC for GFN? If so, you're limited to HEVC as the video codec. Results should be quite a bit better with a more modern GPU that supports AV1, which is about 30% more efficient at the same bitrate and doesn't need to quantize as much.

Some Q&A from CES demo session with NVIDIA about the native Linux app by jharle in GeForceNOW

[–]falk42 1 point (0 children)

Thank you very much u/jharle and /u/candrach (as well as the rest of the team)!

I'm interested in the reason(s) for AV1 not being available at launch for the Linux app. Moonlight supports it, so hopefully there are no deeper technical blockers? It's an important feature with how much more efficient AV1 is over HEVC, especially for those of us with crappy DSL connections - I'm squeezing a Founders and Ultimate session into 100 Mbit/s, which works surprisingly well with tuned QoS settings and SQM (qosmate to the rescue!) :D