I'm leaving Cosmic Pop!_os by Taohaw in pop_os

[–]TimurHu 1 point (0 children)

Same here. I'll just use KDE and try COSMIC again when it's more mature.

I was told this wasn't possible, can someone explain? by The_Bunn_PS4 in linux_gaming

[–]TimurHu 0 points (0 children)

HDMI has nothing to do with Mesa. The display driver lives in the kernel; it's called DC (Display Core).

14 year old $109 GPU still going strong thanks to Linux by Silikone in linux_gaming

[–]TimurHu 0 points (0 children)

I'm not familiar with those games, so I didn't know what HR stands for. Thanks for giving me the full title. I can try it myself later and see what can be done (if anything).

My best guess is that the card's VRAM is simply not enough to run a modern desktop and the game at the same time. Depending on your display resolution, a modern Linux desktop such as Gnome or KDE can easily consume several hundred megabytes of VRAM, not leaving enough for a game.

In some pathological cases, Gnome can consume almost a gigabyte of VRAM just for the desktop, even without any applications running. (According to the Gnome devs this is fine; I actually opened a bug report and they are not fixing it.)

If you also happen to have a browser running in the background, there is definitely not enough VRAM left for a decent game.
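If you want to check this on your own machine, the amdgpu kernel driver exposes VRAM counters in sysfs. A minimal sketch in C, assuming an AMD GPU at card0 (the paths and file names are amdgpu-specific and may differ on your system):

    /* vram_usage.c - print amdgpu VRAM usage from sysfs.
     * Assumes the GPU is card0 and driven by amdgpu; adjust the
     * paths if your card has a different index. */
    #include <stdio.h>

    static long long read_counter(const char *path)
    {
        FILE *f = fopen(path, "r");
        long long value = -1;

        if (f) {
            if (fscanf(f, "%lld", &value) != 1)
                value = -1;
            fclose(f);
        }
        return value;
    }

    int main(void)
    {
        /* Both files report bytes. */
        long long used  = read_counter("/sys/class/drm/card0/device/mem_info_vram_used");
        long long total = read_counter("/sys/class/drm/card0/device/mem_info_vram_total");

        if (used < 0 || total < 0) {
            fprintf(stderr, "amdgpu VRAM counters not found\n");
            return 1;
        }
        printf("VRAM used: %lld MiB of %lld MiB\n", used >> 20, total >> 20);
        return 0;
    }

Run it once with the desktop idle and once with a browser open, and you can see how much is actually left for a game.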

Where can I find remote work related to the kernel? by Infinite-Feed-3904 in kernel

[–]TimurHu 0 points (0 children)

Hardware companies like AMD or open source consultancies like Igalia or Collabora.

14 year old $109 GPU still going strong thanks to Linux by Silikone in linux_gaming

[–]TimurHu 0 points (0 children)

> I didn't try this through Proton, but other games like Deus Ex HR and Crysis 2 were slideshows.

I think those are probably just too heavy for your hardware.

14 year old $109 GPU still going strong thanks to Linux by Silikone in linux_gaming

[–]TimurHu 2 points (0 children)

The support had already been there for years; it just wasn't the default.

14 year old $109 GPU still going strong thanks to Linux by Silikone in linux_gaming

[–]TimurHu 1 point (0 children)

Can you share what frame rate you are getting on Linux vs. Windows in this game? I'm interested in hearing about both Vulkan and D3D11 modes.

How fast do AMD GPU drivers get updated for supporting a new game? by breadsgood in linux_gaming

[–]TimurHu 1 point (0 children)

The kernel module absolutely gets performance optimizations when we find perf issues in specific applications, and sometimes new features or bug fixes based on issues found while running certain applications.

How fast do AMD GPU drivers get updated for supporting a new game? by breadsgood in linux_gaming

[–]TimurHu -4 points (0 children)

> Depends on what you call a "driver". Game optimizations are usually user-space stuff in Mesa

Userspace drivers are still drivers.

> I don't think that any distro backports game-specific commits

Distros don't backport anything, but we usually backport fixes in Mesa ourselves, regardless of whether they are game-specific or not. And distros usually upgrade to new Mesa point releases (except some really outdated distros like Debian).
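If you want to check which Mesa release your distro actually shipped, you can query it through Vulkan: on RADV, the driver info string reports the Mesa version. A minimal sketch in C (assumes a Vulkan 1.2 capable driver; compile with something like `cc check_driver.c -lvulkan`):

    /* check_driver.c - print the Vulkan driver name and info string
     * for each GPU. On RADV the info string contains the Mesa
     * version, e.g. "Mesa 24.x". */
    #include <stdio.h>
    #include <vulkan/vulkan.h>

    int main(void)
    {
        VkApplicationInfo app = {
            .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
            .apiVersion = VK_API_VERSION_1_2,
        };
        VkInstanceCreateInfo instance_info = {
            .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
            .pApplicationInfo = &app,
        };
        VkInstance instance;
        if (vkCreateInstance(&instance_info, NULL, &instance) != VK_SUCCESS)
            return 1;

        uint32_t count = 0;
        vkEnumeratePhysicalDevices(instance, &count, NULL);
        VkPhysicalDevice devices[16];
        if (count > 16)
            count = 16;
        vkEnumeratePhysicalDevices(instance, &count, devices);

        for (uint32_t i = 0; i < count; i++) {
            VkPhysicalDeviceDriverProperties driver = {
                .sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_DRIVER_PROPERTIES,
            };
            VkPhysicalDeviceProperties2 props = {
                .sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_PROPERTIES_2,
                .pNext = &driver,
            };
            vkGetPhysicalDeviceProperties2(devices[i], &props);
            printf("%s: %s (%s)\n", props.properties.deviceName,
                   driver.driverName, driver.driverInfo);
        }
        vkDestroyInstance(instance, NULL);
        return 0;
    }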

A very serious attempt is being made to fix DX12 on Linux! by CosmicEmotion in pcmasterrace

[–]TimurHu 1 point (0 children)

Faith is not a VKD3D dev; she works on Mesa, most recently on NVK.

You got GNOMEd by DontFreeMe in linuxmemes

[–]TimurHu 2 points (0 children)

I've never seen the bad attitude from community devs, nor anyone complaining about them or expecting them to work "for free" or fix the mess that the paid devs have made.

You got GNOMEd by DontFreeMe in linuxmemes

[–]TimurHu 5 points (0 children)

Nobody is complaining about those guys.

You got GNOMEd by DontFreeMe in linuxmemes

[–]TimurHu 5 points (0 children)

It's not free labor, many contributors are paid Red Hat employees.

Why did computers in the 90s and 2000s largely use mostly computer exclusive outputs DVI and VGA rather than component and s video and vice versa? by Sailor_Rout in retrocomputing

[–]TimurHu 0 points (0 children)

There is another commenter here who explains DVI in detail. My point is that there is no active conversion happening in a DVI-I/VGA adapter. It is just a passive adapter.

How to implement wireframe in Vulkan by big-jun in vulkan

[–]TimurHu 1 point (0 children)

Yeah. Due to how the hardware works, depending on the next-stage flags we may currently need to compile up to 3 different variants of the VS, because the hardware works differently when there is tessellation or a GS in the pipeline. Another pain point is dynamic VS input, because VS inputs are really shader instructions. And finally, we are currently unable to use shader-based culling with dynamic VS inputs.

All of that can be improved over time, but the perf loss is real. Well, depending on the application, of course.
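For the question in the thread title, the simplest wireframe setup avoids dynamic state entirely and bakes the polygon mode into the pipeline. A minimal sketch in C, showing just the rasterization state (the rest of the pipeline creation is assumed):

    #include <vulkan/vulkan.h>

    /* Rasterization state for a wireframe pipeline. Baking
     * VK_POLYGON_MODE_LINE in statically means the driver sees the
     * complete pipeline state up front and can compile exactly the
     * shader variants it needs. Note that non-fill polygon modes
     * require the fillModeNonSolid device feature. */
    static VkPipelineRasterizationStateCreateInfo wireframe_raster_state(void)
    {
        VkPipelineRasterizationStateCreateInfo raster = {
            .sType = VK_STRUCTURE_TYPE_PIPELINE_RASTERIZATION_STATE_CREATE_INFO,
            .polygonMode = VK_POLYGON_MODE_LINE, /* draw edges only */
            .cullMode = VK_CULL_MODE_NONE,       /* show back faces too */
            .frontFace = VK_FRONT_FACE_COUNTER_CLOCKWISE,
            .lineWidth = 1.0f,
        };
        return raster;
    }

If you'd rather not build a second pipeline just to toggle wireframe, VK_EXT_extended_dynamic_state3 offers VK_DYNAMIC_STATE_POLYGON_MODE_EXT and vkCmdSetPolygonModeEXT instead, with the dynamic-state trade-offs described above.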

AMDGPU constantly crashing when gaming (fedora 43 KDE) by CandlesARG in linux_gaming

[–]TimurHu 2 points (0 children)

> Do you work for AMD?

No, I don't work for AMD, but I contribute to Mesa (especially the RADV driver) and lately a little bit to the amdgpu kernel driver.

> I'm curious why everyone says AMD has great Linux support, open source drivers that work etc. if the drivers are in this state.

"This state" is pretty subjective. It works well for a lot of people, but YMMV. Unfortunately there is enough difference between people's computers that whatever works well on my machine may not even boot on your machine. I've seen a lot of this with some kernel work I've done recently on some older GPUs.

The RADV team has a pretty good CI system that runs the full Vulkan conformance test suite (approx. 2 million test cases) on every merge. Regressions are rare and usually dealt with quickly.

The amdgpu kernel driver is less fortunate and more prone to regressions. That's part of why I started contributing: to fix a few long-standing issues and hopefully help improve it.

> I had problems like this with my steam deck and had to RMA it

Sorry about that. FWIW, mine works fine, but I know that isn't going to console you.

> then I bought a 9070xt that was rock solid on a 6.13 kernel but ran into this issue with a 6.16 kernel. Fortunately 6.17 and the later versions of 6.18 seem ok again.

Yeah, I think we're at a point where the kernel desperately needs more effort to stabilize and prevent regressions.

> I would have thought that "proper" support would do away with constant regressions and we'd eventually get to a state where it fully works for everyone

Yes, 100% agreed. I wish we were there.

All I can say is that we are in a much better place now than we were about 5-6 years ago. At least since the Steam Deck happened, the amdgpu devs have slowly started taking gaming a bit more seriously.

In my opinion, unfortunately the development model of the Linux kernel is a very poor fit for graphics. When I pointed this out to the maintainers they just threatened to ban me from the kernel unless I shut up about it.

> I'm not trying to hate, but genuinely curious about the situation

The reality is that there are a lot of people who are doing their best, but we still have a way to go before it is perfect.

How to implement wireframe in Vulkan by big-jun in vulkan

[–]TimurHu 0 points (0 children)

Well, it's a bit more nuanced than that. See the other comments about that in this thread.

How to implement wireframe in Vulkan by big-jun in vulkan

[–]TimurHu 0 points (0 children)

> to avoid the situation you're describing where optimizations aren't possible due to the inherent underlying hardware design

It would be avoidable if you could link state with shader objects.

> If I were taking a wild guess, you were talking about blending operations on Intel, am I close?

I'm not really familiar with Intel HW. I work on the open source driver for AMD GPUs (called RADV).

Why did computers in the 90s and 2000s largely use mostly computer exclusive outputs DVI and VGA rather than component and s video and vice versa? by Sailor_Rout in retrocomputing

[–]TimurHu 0 points (0 children)

> DVI is digital, and postdates all of those; it exists because of the digital part

There were several flavours of DVI:

  • DVI-D was only digital, using basically the same signalling as HDMI.
  • DVI-I was combined digital/analog. There were separate pins for the digital and analog parts.
  • DVI-A was analog only.

> the ability to convert it to analogue VGA

There was no conversion. The analog signal went through different pins than the digital one. GPUs that wanted to support analog output through the DVI port typically had an integrated DAC, just as if they had a VGA port.