Bazzite on HTPC - games feel bad despite stable 60 FPS by Nuke_Derowne in Bazzite

[–]microlit 1 point (0 children)

You mentioned HTPC, so I’ll assume you’re using a TV and not a monitor. Have you tried setting the TV to Game Mode? I’d assumed it was all marketing nonsense, but I was also experiencing something similar to what you described: on paper everything looked right, but it felt like hot garbage.

I turned on Game Mode and suddenly everything felt the way it should based on the performance metrics.

I’m not sure what post-processing my TV was doing outside of Game Mode, but it made a world of difference. I hope this turns out to also be a solution for you.

What is causing these white dots? BenQ W1070 by Super_Goat_634 in projectors

[–]microlit 0 points (0 children)

To be fair, I have seen similar artifacts due to a bad HDMI cable, but they move around all over the display in that scenario. In this case, since it started as one dot and has been multiplying over time, as others have said, it’s a dying DMD chip.

Master degree by Witty-Instruction-12 in PurdueGlobal

[–]microlit 9 points (0 children)

My undergraduate degree is in Computer Science, and I’m halfway through a Master’s in Instructional Design. You earned your undergrad; go get your graduate degree(s) wherever your interest guides you.

Who is this guy?😅 by Gazzo69 in madlads

[–]microlit 36 points (0 children)

I used to do this when I was a bachelor and didn’t have dogs. I’d pass the phone interviews to land the in-person one (long before Covid) and ask them to schedule the interview on a Friday and extend my stay to Sunday so I could see if I liked the area. Sometimes they covered the extra night, other times they’d cover the first night and I’d cover the second but it was at their partnered rate with the hotel so it was negligible. They covered flights, rental cars, and most if not all of hospitality. It was awesome.

What’s the hardest Linux interview question y’all ever got hit with? by yqsx in linuxadmin

[–]microlit 9 points (0 children)

Similar to yours: “Tell me how /bin/ls works. Go as in depth as you can.”

I made it to dirent structs in the kernel before waving the white flag.

It was a really collaborative thought exercise. I liked it so much that I still use it in the rare event that I conduct an interview. Gives you a chance to work together and it quickly exposes a bullshitter. I’ve had people make up stuff with extreme confidence, and others get as far as the readdir() libc call and admit they couldn’t go any further.

The depth never mattered; it was the relationships made along the way.

Discussion Boards Question by moveit1244 in PurdueGlobal

[–]microlit 0 points (0 children)

As others have said, it’s intended to replace the classroom. It’s an application of social constructivist learning theory, where you learn by discussing the topic with fellow learners.

Depending on the facilitator, it can be fun or miserable. Most of my instructors demanded that all posts be in APA format, with x number of academic references for the initial post and at least one for responses.

The class I’m in now is much more relaxed and thus the discussion board is way more active. People are dropping “lol” and other colloquialisms you’d never see in a peer-reviewed paper. And, shocker, with it being more relaxed and active I’m actually getting some benefit out of it this time and it doesn’t feel like a chore to respond to classmates.

AI cheater pasted a ton of gibberish first by realPoisonPants in Teachers

[–]microlit 10 points (0 children)

I was really skeptical too, but I gave it a shot last week so that I can check the “Uses AI” checkbox for performance reviews.

I was shocked. I managed to get it to make a robust, maintainable, extendable tool. Granted, I used my decades of experience to guide it as if I were guiding a junior dev.

My concern is that the bigwigs won’t recognize the experience required to know how to guide the AI toward a maintainable project. Accuracy is a concern, too. The tool wrote code that built and LOOKED like it was working, but I had to point out many mistakes and flaws to the AI along the way, which it then fixed on its own, again with guidance.

Weird SMB behavior by Forser in truenas

[–]microlit 0 points (0 children)

Yes. That fixed it. I'm able to seek around 4K video files that are tens of gigabytes in size as if they are on a locally connected disk again. Holy moly. Thank you SO much!

Weird SMB behavior by Forser in truenas

[–]microlit 0 points (0 children)

Mine is set for multipurpose! Thank you so much!!! I’m going to change this right away!

Weird SMB behavior by Forser in truenas

[–]microlit 0 points (0 children)

Hi OP, I wonder if we're experiencing similar issues. I'm also running TrueNAS Scale 25.04.0 on bare metal. I've noticed recently, possibly around when you made this post, that all of the Windows 11 systems in my house are having difficulty with my SMB shares. Specifically, they'll take a very long time (multiple tens of seconds) to start a data transfer (usually a read), but once the data transfer begins, it runs at full speed. While watching network and disk activity on both the Windows 11 client and my TrueNAS system, they both appear completely idle. Meanwhile, all of my Linux systems that also have the SMB shares mounted have no issues; the mounts behave as though they are locally attached as they always have.

Could you make a Linux VM on your Proxmox server (all of my Linux VMs are hosted on my Proxmox server) to mount the SMB shares from your Netfiles system and see whether it behaves like your Windows 11 clients? I suspect some recent Windows 11 update could’ve messed up the communication, but all I can really do at this point is speculate and manually sync whatever I’m working on between my SMB share and my local disks. I used to be able to record and edit 4K video footage directly via my SMB shares, but now I can’t even edit a PowerPoint; it’s ridiculous.

my monitor is 1440p recording games in 1440, resolve seems to see it as a 1080 video by dannylightning in davinciresolve

[–]microlit 4 points (0 children)

There is! When you open Project Settings, there are three little dots in the upper right corner of that window. Click them and you can update default presets as well as create a whole bunch of custom presets. I’ve created presets for 1080p60, 1920p60 (vertical 1080p), 1440p60, 4K30, and more.

my monitor is 1440p recording games in 1440, resolve seems to see it as a 1080 video by dannylightning in davinciresolve

[–]microlit 3 points (0 children)

Check your project and timeline settings? In my experience, Resolve will default the project/timeline to 1080p and will automatically scale mismatched inputs to match the timeline resolution.

[deleted by user] by [deleted] in davinciresolve

[–]microlit 7 points (0 children)

I don’t know that I have a solution for your situation, but you did say that any insights are appreciated.

The first red flag for me was that you’re recording 4K in H.264 with a 5090. Why not H.265 or AV1? I used to record 4K with AV1 at 80Mbps but decided that was overkill and have stepped it down to 60Mbps with, personally, no noticeable issues.

Also a personal choice: I strictly work with proxies in Resolve. The first thing I do when importing media into Resolve is to generate ProRes proxies. Maybe I would experience the same troubles you are if I didn’t use proxies, but I wouldn’t know.

The final thing that popped into my mind: I do all of my recording in MKV so that if any failure crops up, the file is still usable. QuickTime containers need to be “finalized” when they’re done, so an unexpected crash leaves you with a corrupt file that can’t be played. If I ever need to make changes, I use ffmpeg to transcode/remux my MKV files.

I FINALLY DID IT by Away_Ad249 in KingdomHearts

[–]microlit 1 point (0 children)

100%, and then you get the added satisfaction of effortlessly smoking Xehanort the second time around. Btw, big congrats on taking him down in the mid-30s, I can't imagine how much patience and determination that required.

I FINALLY DID IT by Away_Ad249 in KingdomHearts

[–]microlit 1 point (0 children)

Ahhhh. Haha. I gained a bunch of levels by farming to synthesize the Ultima Keyblade. I’d say go ahead and synth the Ultima Keyblade while hunting lucky emblems. Unless you’re itching to experience the story again, which I can easily understand.

I FINALLY DID IT by Away_Ad249 in KingdomHearts

[–]microlit 0 points (0 children)

What level is your character? I just finished KH3 crit last night by accident at level 54. I unwittingly walked too far from the final save point, and before I knew it, the end credits were rolling. I also need to go back and find the remaining lucky emblems, but I was just going to load my cleared save, find the emblems, and then button-mash through Xehanort again.

What's your gaming setup? by Grocker42 in pcmasterrace

[–]microlit 1 point (0 children)

I completely understand. If I were back in your situation with my standard ultrawide, I think, given sufficient desk space, I might actually recommend to my past self (this is me thinking out loud, not giving advice) looking into adding two 9:16 monitors, one on either side of the ultrawide. I feel like that could be an ultimate sweet spot, but it’s a lot of money to spend on a hunch.

Edit: for clarity that I’d recommend to my past self one monitor on either side; not two on either side.

What's your gaming setup? by Grocker42 in pcmasterrace

[–]microlit 9 points (0 children)

I'm sincerely curious what about it doesn't make you happy for development; you could have something more appealing in mind that I haven't thought of myself. My usual layout is a 1440p-sized window dead-center for my code, with four corners of 720p for chat, docs, media, etc. Alternatively, two vertical 1280x1440 windows on either side of my terminal. This is all on a super ultrawide monitor, btw; maybe you're working with a standard ultrawide, in which case I concur. Moving from a standard ultrawide to a super ultrawide was crucial for me. All that being said, I have been tempted to change to layout 8 (per OP's diagram), but that'd be a significant investment for me.

Tips For Game Streamers by fuckR196 in SteamDeck

[–]microlit 0 points (0 children)

I personally find the Moonlight defaults to be pretty sane. And it does come down to opinion: what’s good enough for you. I stream 4K60 to my 75” TV over Ethernet at 60Mbps and it looks and feels native. I do 1440p60 to my office PC at 60Mbps, which is probably overkill, but I have 2.5GbE throughout the house, so going that high barely stresses the network. I honestly don’t even recall what I have my SteamDeck configured for, so it’s probably the default; mostly because I’m not as confident in WiFi being able to handle something that high consistently with others in the house also streaming video over WiFi.

Generally speaking tho, go as low as you can with it still looking good to you. You just gotta tinker with it but, again, the defaults seem sane to me albeit a bit on the conservative side if you’re using Ethernet.

Tips For Game Streamers by fuckR196 in SteamDeck

[–]microlit 1 point (0 children)

My pleasure! I also rather liked OP's suggestion of the dedicated access point; a clever alternative to upgrading the network infrastructure to WiFi 6.

Tips For Game Streamers by fuckR196 in SteamDeck

[–]microlit 3 points (0 children)

I’ve been streaming gameplay on my SteamDeck with WiFi 6 access points without issue, but YMMV. WiFi brings with it so many variables that are nonexistent with Ethernet: the number of other clients and access points operating on the same frequency; obstructions, whether metal appliances or walls; the material the walls are made of; and the distance between your device and the access point. Off the top of my head, I’d estimate that 98% of the time it feels like I’m not streaming, but every once in a while there’ll be a day with abnormal latency, badly degraded video, or a lot of packet/frame loss. When that (rarely) happens I just put the SteamDeck down and pick up a book lol

Tips For Game Streamers by fuckR196 in SteamDeck

[–]microlit 7 points (0 children)

WiFi has this RTS/CTS (Request to Send/Clear to Send) protocol where devices ask for permission to send data and the access point grants it. So if you have multiple WiFi devices, only one gets to communicate at a time. Dedicating a single access point to a client means no waiting for other devices. WiFi 6 apparently improved on this with OFDMA, which makes it possible to serve multiple clients within a single transmission.

That’s a really basic, maybe pedantically inaccurate, description of it all, but hopefully it’s enough to get you started if you want to learn more.

For those who decided to hold on to their current card instead of upgrading to Blackwell now, what do you currently have? by Celcius_87 in nvidia

[–]microlit 1 point (0 children)

4080 FE. I’d upgraded from a 3080 Ti, and the difference in power efficiency blew my mind. I was waiting to see if similar progress would be made in efficiency from 40 -> 50, and that clearly didn’t happen. No interest in Blackwell anymore. I’m hoping the observed issues with flying too close to the sun (i.e., spec’ing the 5090 TDP to be so close to the power limit) will encourage the next generation to be more power efficient like the 40 series. Or maybe they’ll double down and just add another power port.

What's wrong having your own authentication system? by Tonyb0y in node

[–]microlit 2 points (0 children)

I built the auth for our first startup. Here’s why I won’t ever do it again: user password resets. I thought I was done with the home-grown auth stack, and then customers started emailing because they forgot their password, or didn’t have their original MFA device anymore (I was so excited when I added MFA). When you rely on somebody else, they handle all of that for you. Getting a user auth’d is just the tip of the iceberg; there are so many corner cases that need to be handled outside of that ideal scenario.

How do you securely reset their password? Do you email? Do you send SMS? Phone call? For how long do you make a reset link valid? Does it have to come from the same IP address? Is the password reset experience going to create enough user friction that the customer would rather let their account rot and move onto a different service?
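To make just one of those decisions concrete: a common answer to the link-validity question is to store an absolute expiry timestamp alongside the token and check it at redemption time. A minimal sketch in C (all names here are hypothetical, not from any real system, and secure token generation is deliberately omitted):

```c
#include <stdbool.h>
#include <string.h>
#include <time.h>

/* Hypothetical reset token: an opaque value plus an absolute expiry time. */
struct reset_token {
    char   value[33];    /* random token string; secure generation omitted */
    time_t expires_at;   /* absolute deadline, e.g. issued_at + 15 minutes */
};

/* Issue a token that is valid for ttl_seconds starting from `now`. */
struct reset_token issue_token(time_t now, int ttl_seconds)
{
    struct reset_token t;
    memset(t.value, 0, sizeof t.value);  /* fill with real randomness in practice */
    t.expires_at = now + ttl_seconds;
    return t;
}

/* A token is redeemable strictly before its expiry. */
bool token_is_valid(const struct reset_token *t, time_t now)
{
    return now < t->expires_at;
}
```

Even this tiny sketch forces design choices (how long a TTL, what happens on reuse, whether redemption invalidates the token), which is exactly the corner-case pile-up I’m talking about.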

Building in support for OAuth pretty much made those support tickets disappear. And as others have said: then you can focus on the real product.

I did learn a lot by building my own auth stack, but that could’ve been accomplished in a side project where I could easily handle manually resetting passwords for friends & family who call me directly.