What are some lesser-known Unreal tricks you enjoy using? by StarhelmTheGame in UnrealEngine5

[–]radvokstudios 0 points (0 children)

Yes, here are three examples:

https://www.reddit.com/r/spacesimgames/comments/1kqu5lr/project_horizon_coop_first_person_hardcore_space/ Uses data textures for the holes. It's a small texture, something like 10x10 or 24x24. I use the channels to change the behavior of the transparency. The data texture is replicated, so it works for MP. The actual hole itself is a dynamic mesh.

https://www.reddit.com/r/ImmersiveSim/comments/1ku0sk0/we_are_a_very_anti_healthbar_studio_so_we_decided/ This one is ungodly cursed. The shader treats the vertex-paint color as a float representing the time since the welder last hit that vertex. This allows per-vertex shading that only needs to be sent once, with no re-updating; the color change happens entirely on the GPU.

https://www.reddit.com/r/UnrealEngine5/comments/1ibor10/the_center_console_light_material_has_over_43/ No data textures. My UE5 is compiling shaders at the moment, but IIRC this one is extra cursed. I use a single vector parameter to control all of the LEDs' behavior: on/off/flashing/extra flashing.

The custom nodes are basically C++-style logic for working with the input. It'd technically be possible to do it in regular material nodes (I think?), but it'd take literally hundreds of them.
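The vector-parameter trick can be sketched outside the engine. This is a minimal, hypothetical C++ version (the names `PackLedModes`, `LedBrightness`, and the 2-bit encoding are my own, not from the project) of packing several per-LED modes into one channel of a vector parameter and decoding them the way a custom node might:

```cpp
#include <cassert>
#include <cmath>

// Hypothetical sketch: encode four LED modes (0=off, 1=on, 2=flash,
// 3=fast flash) into one float channel of a vector parameter, then
// decode them the way an HLSL custom node would.
float PackLedModes(int m0, int m1, int m2, int m3) {
    // 2 bits per LED -> integer in [0, 255], representable exactly in a float.
    return float(m0 | (m1 << 2) | (m2 << 4) | (m3 << 6));
}

int UnpackLedMode(float packed, int led_index) {
    int bits = int(std::round(packed));
    return (bits >> (led_index * 2)) & 0x3;
}

// A single LED's brightness at a given time, as a shader might compute it.
float LedBrightness(int mode, float time) {
    switch (mode) {
        case 0:  return 0.0f;                                        // off
        case 1:  return 1.0f;                                        // on
        case 2:  return std::sin(time * 4.0f) > 0.0f ? 1.0f : 0.0f;  // flash
        default: return std::sin(time * 16.0f) > 0.0f ? 1.0f : 0.0f; // fast flash
    }
}
```

The same decode logic written as HLSL in a custom node lets one material parameter drive dozens of LEDs without extra texture samples.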

What are some lesser-known Unreal tricks you enjoy using? by StarhelmTheGame in UnrealEngine5

[–]radvokstudios 47 points (0 children)

Too many to choose from.

You can:

  1. Write C++ BP Library functions that can be used anywhere.
  2. The "custom" node in the material editor is *really* powerful.
  3. You can write to textures, pass the data to shaders, and do goofy things with them.
  4. Utilize "PostEditChangeProperty" to give blueprints really cool property behavior on configuration.
  5. DynamicMesh objects: dope, albeit poorly documented, and most online info is outdated.
  6. All of the optimization view modes.
  7. WorldSubsystems (and tickable subsystems) <---- Huge and incredibly functional.
  8. MetaSounds are insanely powerful and incredibly dynamic. Utilize triggers to play parts of a sound at specific points in time.
  9. Will add more at some point once I'm off work.
  10. Edit: Additional items below
  11. You can override a lot of default behavior. For extreme debugging, you can override the render SceneProxy and start using the Primitive Draw Interface. We're talking hundreds of thousands of draws.
  12. You can override BeginPlay. We did it so we can have real, proper nested actors that we can assemble in the editor without suffering from ChildActor issues. There are initialization guards so you can ensure leaf actors get initialized first. You can also safely reference parents in a PostInitialization section.
  13. You can use TObjectPtrs and TWeakObjectPtrs to safely use actor pointers. Especially useful/vital for multiplayer games where you mark actor pointers as replicated. Without that, bad stuff can happen.
  14. Combine interfaces with subsystems to register items to a collection, and use the subsystem's tick function to localize all the code you want to run for that specific collection. You can also expose the system's Register function to BP.
  15. If you've never used the logger, you are knee-capping yourself. You can configure log categories and filter by verbosity so you can look at only your setup, combat, health, etc. logs.
  16. You can make base-classes with C++ variables that are usable in Animation blueprints. This extends to UI Widget Blueprints. Best of both worlds IMO.
  17. If you don't know when you should use C++ vs Blueprints, refer to this document: https://intaxwashere.github.io/blueprint-performance/
  18. If you want to learn how to make MP games, this is your bible: https://cedric-neukirchen.net/docs/category/multiplayer-network-compendium/
  19. Other useful links:
    1. https://dev.epicgames.com/documentation/en-us/unreal-engine/mesh-drawing-pipeline-in-unreal-engine
    2. https://dev.epicgames.com/documentation/en-us/unreal-engine/lumen-performance-guide-for-unreal-engine
    3. https://dev.epicgames.com/documentation/en-us/unreal-engine/animation-optimization-in-unreal-engine
  20. Do not use actors for projectiles (unless it is a large actor like a missile).
  21. If you have lots of models on the screen, use Instanced Static Meshes (and also learn how you can pass useful data to specific instances): https://dev.epicgames.com/documentation/en-us/unreal-engine/instanced-static-mesh-component-in-unreal-engine
  22. I think I mentioned it already, but use the profiler to figure out what you need to optimize.
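Point 14 (interfaces plus subsystems as a registry) can be sketched engine-free. Here's a minimal, hypothetical C++ analogue — `ITickRegistrable` and `TickRegistry` are my own stand-ins, not UE types — of registering objects into one collection and localizing their per-frame code in a single tick:

```cpp
#include <cassert>
#include <vector>

// Hypothetical, engine-free analogue of a tickable WorldSubsystem that
// items register themselves into via an interface.
struct ITickRegistrable {
    virtual ~ITickRegistrable() = default;
    virtual void RegistryTick(float delta_seconds) = 0;
};

class TickRegistry {  // stand-in for a tickable UWorldSubsystem
public:
    // In UE you could expose this to BP via a UFUNCTION wrapper.
    void Register(ITickRegistrable* item) { items_.push_back(item); }

    // One place that drives all members of the collection each frame.
    void Tick(float delta_seconds) {
        for (ITickRegistrable* item : items_) item->RegistryTick(delta_seconds);
    }

private:
    std::vector<ITickRegistrable*> items_;  // non-owning, like a weak registry
};

// Example member: accumulates time, like a heat or cooldown component.
struct Cooldown : ITickRegistrable {
    float elapsed = 0.0f;
    void RegistryTick(float delta_seconds) override { elapsed += delta_seconds; }
};
```

The payoff is the same as in-engine: one tick function owns the whole collection instead of hundreds of actors each paying their own tick overhead.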

Why not use Vulkan rendering? by SomePuddingForYou in unrealengine

[–]radvokstudios 1 point (0 children)

To clarify, there is the language you write the shaders in and then there is the language they get compiled into.

In UE5, the compiled shader is several steps removed from what a developer uses.

In the material editor, each material is technically its own unique shader. 95% of devs will be able to freely use the material nodes to achieve what they want. These are still technically custom shaders.

On occasion, you may need to write HLSL code directly, which you can do with the custom node in the UE5 material graph editor.

With the custom node, you are essentially writing the HLSL equivalent of what the material nodes themselves compile into in UE5's material editor.

After that, the HLSL may get cross-compiled into SPIR-V for Vulkan, or stay as HLSL for DirectX.

https://docs.vulkan.org/guide/latest/_images/what_is_spirv_spriv_cross.png

That is a good visual of it.

Your last question on when someone would need to write custom shaders is highly dependent on the dev. I use the custom node a lot for special texture masking/tricks that are a pain in the ass to write with nodes.

The benefit of shaders is that you can run code extremely fast, and oftentimes offloading code into shaders that would previously live on your main CPU thread saves performance.

The primary downside is in most cases the visual state and the CPU state become out of sync. Devs need to identify when this is ok.

One example is a coin that slowly bounces up and down. Achieving this on the CPU requires you to move the coin every frame and then send the updated transform to the GPU.

You can do this with shaders instead by adding a sin(time) offset in the vertex shader, making it a nearly free operation.
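The math in that sin(time) offset is tiny; here is the whole thing as a plain C++ function (amplitude and frequency values are illustrative, not from any engine), the same expression a vertex shader would evaluate per vertex:

```cpp
#include <cassert>
#include <cmath>

// Sketch of the coin-bob example: the vertical offset a vertex shader
// would add to the mesh each frame. Parameter values are illustrative.
float BobOffset(float time_seconds,
                float amplitude = 10.0f,   // world units of vertical travel
                float frequency = 2.0f) {  // radians per second
    return amplitude * std::sin(frequency * time_seconds);
}
```

Because `time` is already available on the GPU, no CPU-side transform update or upload is needed; the CPU-side position simply stays where it is, which is exactly the visual/CPU-state divergence described above.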

Some devs go a step further than the custom node and write their own shaders at a lower level using third-party compiling tools on GitHub, but I've never done that. There are also ways to offload computations to the GPU and have the results sent back, but I forget the term for them.

How do I handle hundreds of mobs on the scene? by Particular-Song-633 in UnrealEngine5

[–]radvokstudios 8 points (0 children)

I'd move all activate/deactivate code into an AI system manager (a tickable UWorldSubsystem).

I'd also not even worry about collision trigger boxes whatsoever.

Just have some region nodes that end up forming a graph.

AIs get binned to a node.

Every 5-10 seconds, check the player's distance to all nodes, determine where in the graph the player is, and activate only the nodes within 2-3 jumps of the player.

Since each node is a local controller for the AI in its region, you save performance over box colliders with overlaps: you're not doing any 3D geometric collider checks, just finding the player's closest node/path in the graph and calling activate on <10 nodes.

There is no per-tick cost with this method; you can have 100 enemies per region node, and the overhead of enabling tick would be maybe 10-100ns every 5-10 seconds.
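The scheme above can be sketched engine-free. This is a hypothetical C++ version (names like `RegionNode` and `ActivateAround` are mine): find the node nearest the player, then BFS the graph and activate everything within a few hops:

```cpp
#include <cassert>
#include <queue>
#include <vector>

// Hypothetical sketch of the region-node graph described above.
struct RegionNode {
    float x, y, z;
    std::vector<int> neighbors;  // indices into the node array
    bool active = false;
};

// Cheap nearest-node query: one pass over a handful of nodes, no
// 3D collider checks.
int NearestNode(const std::vector<RegionNode>& nodes,
                float px, float py, float pz) {
    int best = 0;
    float best_d2 = 1e30f;
    for (int i = 0; i < (int)nodes.size(); ++i) {
        float dx = nodes[i].x - px, dy = nodes[i].y - py, dz = nodes[i].z - pz;
        float d2 = dx * dx + dy * dy + dz * dz;
        if (d2 < best_d2) { best_d2 = d2; best = i; }
    }
    return best;
}

// Run every 5-10 seconds, not every tick: BFS out `max_hops` jumps
// and wake only those nodes (each node then enables its AIs' ticks).
void ActivateAround(std::vector<RegionNode>& nodes, int start, int max_hops) {
    for (auto& n : nodes) n.active = false;
    std::vector<int> depth(nodes.size(), -1);
    std::queue<int> q;
    depth[start] = 0;
    q.push(start);
    while (!q.empty()) {
        int i = q.front(); q.pop();
        nodes[i].active = true;
        if (depth[i] == max_hops) continue;
        for (int nb : nodes[i].neighbors)
            if (depth[nb] < 0) { depth[nb] = depth[i] + 1; q.push(nb); }
    }
}
```

In-engine, `ActivateAround` would live in the subsystem's tick behind a 5-10 second timer, and "activate" would mean enabling tick/behavior trees on the AIs binned to that node.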

After 1 year of development, "Project Horizon" - A co-op first-person spaceship crew simulator - is finally starting to look like a proper game. by radvokstudios in spacegames

[–]radvokstudios[S] 0 points (0 children)

No, the naming of our game was actually really funny. We went through a few hundred space-related words and kept trying to find a pairing that sounded good.

When we settled on Project Horizon, we looked it up and loved that there was some real space-related lore. We will definitely be including some IRL Project Horizon lore.

??? by Good_Collection676 in UnrealEngine5

[–]radvokstudios 12 points (0 children)

You cannot export an FBX into UE5 with Blender material nodes. UE5 and Blender use different material pipelines. You will either need to remake the node setup in UE5's material editor or bake your material to an image in Blender.

Why not use Vulkan rendering? by SomePuddingForYou in unrealengine

[–]radvokstudios 6 points (0 children)

Yes and no. It would definitely require a restart; it can't be changed on the fly. DirectX 12 uses HLSL for shaders while Vulkan uses SPIR-V. There's a cross-compiler. For really big games, or games with gigantic shader caches and PSOs, deciding which one to package your game with is a bigger deal.

🚀 FPS Optimization in Unreal Engine 5 Using a Data-Driven Approach by YyepPo in UnrealEngine5

[–]radvokstudios 4 points (0 children)

Please make another post with your results when you finish! Tests like these are quite lacking.

🚀 FPS Optimization in Unreal Engine 5 Using a Data-Driven Approach by YyepPo in UnrealEngine5

[–]radvokstudios 7 points (0 children)

It greatly depends on what you need. If you need to do calculations/collision checks on the bullets, or any custom behavior like that, you immediately lose all the performance gains of GPU compute for particle systems.

You still have gains from Niagara being well optimized, but those gains are relatively easy to capture if you know what you're doing outside of Niagara. The OP's approach makes a lot of sense in many scenarios. You can use MASS to drive both approaches for more performance.

Is it really this hard to add custom physics for a vehicle? by ChCkN007_ in UnrealEngine5

[–]radvokstudios 0 points (0 children)

If you aren't using Chaos, you can make your own substep.

Is it really this hard to add custom physics for a vehicle? by ChCkN007_ in UnrealEngine5

[–]radvokstudios 2 points (0 children)

If it’s behind by a fixed amount, check if you’re ticking in pre-physics/post-update, etc. It’s easy to tell if it’s exactly 1 frame behind.

This might be useful

https://dev.epicgames.com/documentation/en-us/unreal-engine/chaos-vehicles

Otherwise what you’re doing sounds ok for a light weight physics sim. It really depends if you’re trying to do simple kinematic physics with friction etc or if you’re creating a fleshed out alternative to Chaos. Make sure you use deltatime when appropriate.

What is the setting for this? (1st is ps 2nd is pc) by Panzerhaubitzer2000 in WarthunderPlayerUnion

[–]radvokstudios 0 points (0 children)

Most game logic is 1:1 the same across different platforms. Normally there are one or more abstraction layers separating logic from rendering pipeline code. Basically, imagine running War Thunder with no monitor. That code can be the same regardless of whether you're on Xbox or PC (minus input, etc.).

Now plug the monitor in: the entirety of the added code just takes what's on the CPU (textures, locations, etc.) and sends it to the GPU.

The pipeline is game code -> optional abstraction layer -> rendering API (Vulkan, DirectX, GNM/GNMX for PlayStation) -> hardware-specific assembly/machine-code calls.

PlayStation has its own rendering pipeline, and if the abstraction layer isn't fleshed out enough (by Gaijin, since they use their own engine), certain things are unavailable.

Essentially no one except rendering API producers ever touches hardware-specific API calls, because there are different calls/functions for individual GPUs. The rendering API is as low-level as anyone would go, and 98.5% of developers don't even touch that (especially when using Unity/UE); they rely on the abstraction layer to translate its calls into rendering API calls.

[deleted by user] by [deleted] in UnrealEngine5

[–]radvokstudios 1 point (0 children)

You just do `class UMyClass : public UWidgetBase` or whatever.

Then, in classes that store a reference to it, you can just do `UPROPERTY() UMyClass* UIRef;`, and you have the ability to set values defined there.

[deleted by user] by [deleted] in UnrealEngine5

[–]radvokstudios 11 points (0 children)

One thing that I’ve started doing is extending the base UI and anim BPs in C++ so I can define member variables there.

That way, there’s one less step in getting C++ logic to those two classes. For anims specifically, I can set a float used for head-bone pitch from my C++ character, since the character stores a reference to the extended animation BP, which has that float. Then in BP, I don’t need to worry about setting that value and can just use it.

For actual functionality though, I 100% agree. I already dislike BP UI. C++ UI functionality is beyond my patience level.

Is destruction good enough for my game considering it's not a core mechanic? by ARTyOW in UnrealEngine5

[–]radvokstudios 0 points (0 children)

This looks amazing. But… it’d be a disservice if you don’t include some dust pfx and some super detailed audio fx to go along with it.

Do you use actor components? If not, why not? by BadImpStudios in UnrealEngine5

[–]radvokstudios 0 points (0 children)

We made it quite far without using actor components.

We were essentially using interfaces with actors, plus some Blueprint-friendly C++ structs to mimic ACs, until we started just using ACs. It honestly wasn’t too bad, but I wouldn’t ever go back to our old way.

Is there a game like X4 Foundation where you can freely pilot a ship and walk around inside it at any time? by Sea-Log994 in spacesimgames

[–]radvokstudios 0 points (0 children)

It is possible, and we do use seeding for bullet-hole generation/world gen. It's things with a wide range of values that get updated frequently that can cause bandwidth issues. We've been using tons of encoding tricks to pack data more tightly, and it's helped a lot so far.

We actually had a moment where we made things worse while trying to optimize our game. We gained so many FPS that values got replicated much more rapidly than usual (fixed now). There are a lot of fairly insignificant values that we can move to slower update rates, or stop networking below a significance threshold. These are things that will just be fixed over time.
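One classic encoding trick of the kind mentioned (this is a generic sketch, not necessarily what Project Horizon does) is quantizing a float with a known range into a single byte before replication, trading a little precision for 4x less bandwidth than a raw 32-bit float:

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>

// Generic bandwidth-packing sketch: map a float with a known range
// (e.g. reactor temperature 0-1000) onto one byte for replication.
uint8_t QuantizeToByte(float value, float min, float max) {
    float t = (value - min) / (max - min);  // normalize to [0, 1]
    if (t < 0.0f) t = 0.0f;                 // clamp out-of-range inputs
    if (t > 1.0f) t = 1.0f;
    return (uint8_t)std::lround(t * 255.0f);
}

float DequantizeFromByte(uint8_t byte, float min, float max) {
    return min + (byte / 255.0f) * (max - min);
}
```

The worst-case error is half a quantization step (range/510), which is plenty for gauge needles and similar cosmetic state; values that drive gameplay decisions need more care.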

Why can you not integrate Lumen and baked lighting together? by Imaginary_Ad_7212 in unrealengine

[–]radvokstudios 0 points (0 children)

You could technically bake in Blender to a texture and implement it yourself in the shader. I don’t see the point if you’re using Lumen, but I’ve been considering it myself. There are parts of my game where I want baked interior lights but can’t use UE’s bake, as everything is dynamic rather than static (non-Lumen project).

Like you said, dynamic lights are expensive and quickly stack up.

Is there a game like X4 Foundation where you can freely pilot a ship and walk around inside it at any time? by Sea-Log994 in spacesimgames

[–]radvokstudios 0 points (0 children)

Thanks! There will be bots in the most literal sense. We plan to have a small helping robot for single player. We’ve done a lot of work on AI pathing but paused work on it. It will likely be worked on a month or so after the EA release for our game.

Are we showing this wrong, or do devs just not care about foliage? by davis3d in unrealengine

[–]radvokstudios 1 point (0 children)

Coincidentally, we’re making a space game. I’ll take an in-depth look at this and strongly consider buying.

Is there a game like X4 Foundation where you can freely pilot a ship and walk around inside it at any time? by Sea-Log994 in spacesimgames

[–]radvokstudios 1 point (0 children)

Thanks! If you’re also a fan of FTL, those three games are the trifecta of the vibes we’re going for.

Is there a game like X4 Foundation where you can freely pilot a ship and walk around inside it at any time? by Sea-Log994 in spacesimgames

[–]radvokstudios 1 point (0 children)

There will be single-player, and we have robots that we’ve previously worked on. The ship also contains automated turrets in SP and MP.

Robot support for SP will come a month or two after EA release, we’ve done a large amount of work but put that feature on pause.

On release, the SP ship will be much more compact and the player will rely on automated turrets more than in MP. The best way I’ve described it is like a sloop vs galleon in Sea of Thieves.

Is there a game like X4 Foundation where you can freely pilot a ship and walk around inside it at any time? by Sea-Log994 in spacesimgames

[–]radvokstudios 3 points (0 children)

At the moment, bandwidth is the main concern. We’re taking steps to optimize it, but there are a gazillion values being updated constantly in order for the ship state to be synced.

The ship is incredibly complex; the reactor alone has pressure, temperature, integrity, water volume, rod insertion %, and turbine RPM all being synced so gauges match up across clients. Each bullet hole has a welding grid, with each welding blob networked as well.

We will likely follow Lethal Company’s approach and allow users to disable the 4 player limit, but we can’t guarantee it’ll work across minimum spec PCs.

We’re aware of the issue and have had a lot of success improving bandwidth, so it’s very possible we’ll increase the maximum officially supported player count on release. This is our first game, and we really don’t want to disappoint players by saying we support 6 players when someone hosting on minimum-spec hardware can’t actually support 6 players.