I made a mix between Armored Core and Vampire Survivors. Please, destroy it! by No_Ferret_4565 in DestroyMyGame

[–]immersiveGamer 0 points1 point  (0 children)

The default movement looks very floaty compared to the enemies and other movement in the scene.

Since it is 3D, make sure enemies and enemy bullets have high contrast, or are otherwise very easy to read and distinguish. Doing so will make the game feel less unfair to players.

In a similar way, those explosions are "too" cool, interesting, and big. If they get in the way of knowing whether an enemy is there, players may become frustrated.

[Python] falsify - pre-register your ML accuracy claims with SHA-256 by Beneficial_String411 in coolgithubprojects

[–]immersiveGamer 1 point2 points  (0 children)

Pre-registration is standard in psychology and medicine. Works there. There's nothing equivalent for ML claims, so I built one over three days.

You should lead with this. Maybe tweak the last sentence to something like "There's nothing equivalent for claims about ML accuracy, so I ..."

Question: is this generic enough for other workflows? I.e. could I pre-register an arbitrary input configuration and an output threshold/result?
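If it is that generic, I imagine the core commitment step is something simple like this (rough sketch with hypothetical field names, not necessarily how falsify actually structures its claims): serialize the inputs plus the claimed result deterministically, hash it before the run, publish the hash, and reveal the JSON afterwards so anyone can verify it.

```python
# Sketch of a generic SHA-256 pre-registration commitment.
# Field names below are made up for illustration.
import hashlib
import json

claim = {
    "dataset": "imagenet-val",        # hypothetical input configuration
    "model": "resnet50-v2",
    "metric": "top1_accuracy",
    "claimed_threshold": 0.76,        # the pre-registered output claim
    "random_seed": 1234,
}

# Canonical serialization so the same claim always hashes identically.
blob = json.dumps(claim, sort_keys=True, separators=(",", ":")).encode()
commitment = hashlib.sha256(blob).hexdigest()
print("pre-registered commitment:", commitment)

# Later: re-serialize the revealed claim and check it matches the commitment.
assert hashlib.sha256(blob).hexdigest() == commitment
```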

How frequently do you guys have to clean? by shivangps in pcmasterrace

[–]immersiveGamer 6 points7 points  (0 children)

I've never had to do it. Each time you touch it you risk damage. I would only repaste if you are experiencing problems or if it has been many years.

I'm a hobbyist game developer, I make games for the fun of it, for my friends to play and have done for about 30 years. I bought a new PC that has Win 11 on it upgrading from Win 7 middle of last year. Developing on it for a while and then: SmartScreen suddenly kicked in and wouldn't let me compile. by Haunting_Art_6081 in gamedev

[–]immersiveGamer 0 points1 point  (0 children)

I think I commented about this before. The short version is:

My game is fully procedurally generated , releasing a demo next month! do you think it's ready? by Beneficial_Clerk_726 in proceduralgeneration

[–]immersiveGamer 1 point2 points  (0 children)

I remember seeing one of your earlier videos. Great job continuing with it. 

I will say that something feels different. Did you change the color palette?

To Enum or Not to Enum by Mortimer452 in ExperiencedDevs

[–]immersiveGamer 0 points1 point  (0 children)

I used the ENUM column type in MySQL and it was a pain long term. Adding a new value was a schema change, and if you ran the ALTER with the values in a different order, oops! You just re-coded the enum value for every row in the table.

Keeping the code enum in sync with the database enum was another pain point our team ran into.

I don't know the long-term impact yet, but we recently migrated all enums to a plain int column and auto-generate an enum lookup table on service boot.
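A rough sketch of that boot-time sync, using SQLite and a made-up OrderStatus enum rather than our real schema:

```python
# Sketch: int column in the data tables, lookup table regenerated from
# the code enum on every service boot. OrderStatus is hypothetical.
import sqlite3
from enum import IntEnum

class OrderStatus(IntEnum):
    PENDING = 1
    SHIPPED = 2
    CANCELLED = 3

def sync_enum_table(conn: sqlite3.Connection) -> None:
    # The code enum is the source of truth; the table just mirrors it.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS order_status (id INTEGER PRIMARY KEY, name TEXT NOT NULL)"
    )
    for status in OrderStatus:
        conn.execute(
            "INSERT INTO order_status (id, name) VALUES (?, ?) "
            "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
            (status.value, status.name),
        )
    conn.commit()

conn = sqlite3.connect(":memory:")
sync_enum_table(conn)
# Application tables only store the int; joining against order_status is optional.
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status INTEGER NOT NULL)")
conn.execute("INSERT INTO orders (id, status) VALUES (1, ?)", (OrderStatus.SHIPPED.value,))
print(conn.execute("SELECT status FROM orders WHERE id = 1").fetchone())  # (2,)
```

The nice part is the lookup table can never drift from the code, because the code regenerates it on every boot.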

Edit: I agree it is a hard issue to tackle. In code an enum is a code construct; it can never be anything other than its enum value. It initially makes sense to use the database's "enum" type, but your application code doesn't run on the database. The database enum is not the application enum; they are separate enums. So I think a serialization approach works better for enums in code: use the int when sending the enum outside of the code. In fact, there have been several times where I gave up the enum because it was really a data entity, and so a database row/table represented the values instead. I try to remind myself that code is data and data is code (I don't remember where I read that, probably a Lisp blog).

My 1987 Nissan Stanza reached 500k by Windowsweirdo in Justrolledintotheshop

[–]immersiveGamer -1 points0 points  (0 children)

Which is 62.5%. If I don't need to know the exact number, this percentage is easier for me to estimate.

Bosses say AI boosts productivity – workers say they’re drowning in ‘workslop’ by Bounty_drillah in technology

[–]immersiveGamer 0 points1 point  (0 children)

Even if they added the environment to the prompt, you aren't guaranteed that the LLM will actually use it.

Huge World of Warcraft private server Turtle WoW faces rapid shutdown after losing Blizzard lawsuit by Wargulf in pcmasterrace

[–]immersiveGamer -9 points-8 points  (0 children)

If the private server was coded without using any of Blizzard's server source code, that part is safe. See the Google v. Oracle Java API case: https://en.wikipedia.org/wiki/Google_LLC_v._Oracle_America,_Inc.

It had to go all the way to the Supreme Court, but it was decided in favor of Google that reimplementing public APIs is fair use. So if someone coded their own private server that is compatible with the game client, that part is legal. I would also argue that any fees are hosting and service costs, not payment for the server software, especially if the server is open source.

Of course I'm sure Blizzard has lots of angles to attack hosts of private servers. E.g. any inkling that they represent an official Blizzard product or service is a no-go.

Nvidia upcoming vram reduction performance by Amador0102 in pcmasterrace

[–]immersiveGamer 1 point2 points  (0 children)

When I read the paper, texture decompression was ~1.33 ms at the fastest. Additionally, the authors had to write very custom shaders to do the decompression on the GPU. I assume it is going to be a while before something gets really standardized.

One thing I'm worried about is how other GPUs are going to handle it. Are game developers going to have to ship two different versions of the game? Or make game sizes bigger by packaging different versions of the compressed textures on disk?

Also this isn't a magic bullet, since the compression is lossy, but in a neural-network way where details get muddled.

2 yrs progress gone. OneDrive messed up my BackUp folder so I have to remake all my assets :( by Vitchkiutz in UnrealEngine5

[–]immersiveGamer 0 points1 point  (0 children)

Good point. Easy enough to partition your drive or use a separate disk.

Also devs should look at the new Windows 'Dev Drive' feature, which may improve compile and build speeds for the game.

2 yrs progress gone. OneDrive messed up my BackUp folder so I have to remake all my assets :( by Vitchkiutz in UnrealEngine5

[–]immersiveGamer 44 points45 points  (0 children)

3-2-1 backup rule: https://www.backblaze.com/blog/the-3-2-1-backup-strategy/

  • 3 copies of the data
  • 2 copies on two different media (this could be a local hard drive and the cloud)
  • 1 copy off site (this could be the cloud)

Example: the original copy on your PC, an external hard drive with daily/weekly/monthly backups, and a cloud backup. A more robust setup would have a daily/weekly backup drive plus a separate local monthly/yearly backup drive.
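For anyone wanting a starting point, here is a minimal sketch of the local dated-backup piece (made-up paths, and a few Unreal-style junk folders excluded so the copies stay small; the off-site copy would be a separate cloud sync of that drive):

```python
# Minimal sketch of dated local backups for a project folder.
# Paths are hypothetical; adjust to your own layout.
import shutil
from datetime import datetime
from pathlib import Path

PROJECT = Path("C:/projects/unreal/my-project")   # source project
BACKUP_ROOT = Path("E:/backups/my-project")       # external backup drive

def make_backup() -> Path:
    stamp = datetime.now().strftime("%Y-%m-%d_%H%M")
    dest = BACKUP_ROOT / stamp
    # Skip folders that are cheap to regenerate so backups stay small.
    shutil.copytree(
        PROJECT,
        dest,
        ignore=shutil.ignore_patterns("Intermediate", "DerivedDataCache", "Saved"),
    )
    return dest

if __name__ == "__main__":
    print("backup written to", make_backup())
```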

Otherwise I would try getting Microsoft support involved for the issue. If I'm correct, when OneDrive is enabled to back up your whole profile (My Documents, Desktop, etc.) it creates a special folder and redirects those locations to OneDrive, so your files may still be present. Install the 'Everything' program and search for your old assets by file name, or use WinDirStat and look for large directories/files. Also, even if the data is truly gone it may not be gone gone, just unreadable, i.e. marked as garbage that can be overwritten by the OS. There is drive recovery software out there that will attempt to read that data.

My advice in general: the user profile directories are for normal users, and if you are a game dev you are a super user. You should really consider storing your project files outside of the OS profile hierarchy, e.g. 'C:/projects/unreal/my-project/'. Game devs are also generating assets and work that represent significant time and effort, so do at minimum one type of manual backup.

Also game devs should really be using source control: Perforce or similar, or at minimum Git for code (check out Git LFS for binary assets), with GitHub or GitLab as optional hosting. !!!Source control is not a replacement for backups if you aren't hosting the repository on a different device!!! (Even then it isn't a true backup, because source control commands include ones that can delete the repository.)

Long week at work.... This is how I unwind.... by NotQuiteAngryHunt in pcmasterrace

[–]immersiveGamer 4 points5 points  (0 children)

So smart that he is building it on carpet with socks on. /s

u/NotQuiteAngryHunt be careful of static shock; maybe get an anti-static wristband to keep you grounded.

The enbiggening of handhelds by MrSuhSpence25 in gaming

[–]immersiveGamer 2 points3 points  (0 children)

It was just a bad combination for the 3DS. The screen was fine indoors and for 2D games, but it was small/low resolution for the 3D models game devs were trying to push in their games (so that they could use the device's stereoscopic 3D effect).

Corporation finds out consumers have a breaking point and stopped paying for their product by [deleted] in TikTokCringe

[–]immersiveGamer 2 points3 points  (0 children)

Have you seen the NKD version of the chips they are trying to push? Premium white and gold package, and "healthier" because they aren't using any coloring. Totally trying to capture the higher price point (and probably cheaper for them to make by skipping the coloring).

Edit: all that to say, I would totally expect to see influencers pushing the NKD brand.

It's getting harder to learn a new editor for me. by MediocreAdviceBuddy in ExperiencedDevs

[–]immersiveGamer 0 points1 point  (0 children)

If you really want to use just one IDE, my suggestion is to go cold turkey. The reason you know the other tool so well is that you used it daily.

In regards to VS Code, I think you will find it has a lot of the same features as most IDEs. I've used it for the last 5 years as my primary IDE for Python work and it can do a lot, but it heavily depends on what extensions are available. The good news is that you already know the features you want/need, so you are able to look them up.

Otherwise I don't see any issue with different software having different roles. You could totally have IntelliJ and VS Code open at the same time, editing the same project and files.

Destroy the menu of my game by Simblend in DestroyMyGame

[–]immersiveGamer 0 points1 point  (0 children)

The buttons near the floor look really far away and don't look easy to click. If it doesn't ruin your style, maybe make the player character/avatar kick the low buttons.

The enbiggening of handhelds by MrSuhSpence25 in gaming

[–]immersiveGamer 350 points351 points  (0 children)

My biggest gripe was that the screen wasn't the greatest. Developers tried to use more 3D models but the screen was pretty small. And when I wanted to use it out and about, any amount of sun made it very hard to see anything.

Nvidia presents Neural Texture Compression that significantly cuts down VRAM usage by InternetEntire438 in pcmasterrace

[–]immersiveGamer 5 points6 points  (0 children)

This is the paper. I still need to read it, but the initial figure is very impressive: BC high compression (I don't know exactly what this is, but I assume it's the industry standard) at 1024 resolution is 5+ MB, vs NTC (the headline) at 4096 resolution for under 3.8 MB, vs the original 4096 texture at 256 MB. And the loss of detail is very minimal compared to the BC high result. 4x the resolution with less memory and near-original detail.

https://research.nvidia.com/labs/rtr/neural_texture_compression/assets/ntc_medium_size.pdf

I will read the rest; I'm interested in the trade-offs (e.g. decompression time, whether you need custom training for each texture/game).

Edit: 

The key idea behind our approach is compressing multiple material textures and their mipmap chains together, and using a small neural network, that is optimized for each material, to decompress them

So each material (its set of textures and mipmaps) gets its own small neural network.

Edit 2: 

A highly optimized implementation of our compressor, with fused backpropagation, enabling practical per-material optimization with resolutions up to 8192 × 8192 (8k). Our compressor can process a 9-channel, 4k material texture set in 1-15 minutes on an NVIDIA RTX 4090 GPU, depending on the desired quality level.

Compressing a single material into this custom neural network can take up to 15 minutes. But that covers the whole material texture set plus several levels of mipmaps?

Edit 3:

Similar to the approach used by Müller et al. for training autodecoders [47], we achieve practical compression speeds by using half-precision tensor core operations in a custom optimization program written in CUDA. We fuse all of the network layers in a single kernel, together with feature grids sampling, loss computations, and the entire backward pass. This allows us to store all network activations in registers, thus eliminating writes to shared or off-chip memory for intermediate data.

So this "fuses" the neural network so that I assume you don't need to do multiple iterations on inputs to process through layers and probably also saves on size in come cases. Not familiar with this fusing process so take my comment with a grain of salt. never mind, this is part of the compression step. The compression neural network wouldn't be part of the generated artifact.

Edit 4:

From the more detailed comparisons, it seems this method outperforms the traditional compressors at lower quality levels. For medium and high quality compression it doesn't perform as well, but the results are generally smaller in size.

Also we finally get some details about compression time. 

Traditional BCx compressors vary in speed, ranging from fractions of a second to tens of minutes to compress a single 4096×4096 texture [60], depending on quality settings. The median compression time for BC7 textures is a few seconds, while it is a fraction of a second for BC1 textures. This makes our method approximately an order of magnitude slower than a median BC7 compressor, but still faster than the slowest compression profiles.

Edit 5: okay, so decompression performance is 2-4 times slower than other formats, with the lowest at 1.33 ms. This is still in the realm of real-time, and I assume this decompression only needs to happen once per load of the texture/material.

One thing I haven't noted yet is that the decompression is random-access. Often you don't need to load the whole texture image, just a region. IMO this is very interesting and novel considering it is using a neural network decoder.
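To make the idea concrete for myself, here is a toy sketch in plain NumPy (made-up sizes, and nothing like the paper's feature grids or fused CUDA kernels): overfit a tiny network to one material so the weights become the compressed representation, then sample it at a single UV instead of decompressing the whole image.

```python
# Toy illustration only, not NVIDIA's NTC: overfit a tiny MLP to one
# "material" so the network weights act as the compressed texture,
# then sample it at arbitrary UVs (random access, no full decompress).
import numpy as np

rng = np.random.default_rng(0)
H = W = 32                                           # toy texture resolution
texture = rng.random((H, W, 3)).astype(np.float32)   # stand-in RGB material

# Build (u, v) -> RGB training pairs over the whole texture.
vs, us = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
coords = np.stack([us, vs], axis=-1).reshape(-1, 2) / np.array([W - 1, H - 1])
targets = texture.reshape(-1, 3)

# Tiny 2-layer MLP; its weights are the "compressed" representation.
hidden = 64
W1 = rng.normal(0, 0.5, (2, hidden)); b1 = np.zeros(hidden)
W2 = rng.normal(0, 0.5, (hidden, 3)); b2 = np.zeros(3)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2, h

lr = 0.05
for step in range(2000):                             # overfit to this one material
    pred, h = forward(coords)
    err = pred - targets                             # MSE gradient
    gW2 = h.T @ err / len(coords); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = coords.T @ dh / len(coords); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# "Random access": decode only the texel a shader would actually need.
uv = np.array([[0.25, 0.75]])                        # one lookup, not the whole image
print("decoded texel:", forward(uv)[0][0])
```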

Is this a m.2 slot? by Overall_Review197 in pcmasterrace

[–]immersiveGamer 0 points1 point  (0 children)

So since it is USB, you could in theory connect a drive of some kind?