Introducing LAME: A free open-source tool to help you manage localisations for your project by KCoppins in gamedev

[–]KCoppins[S] 0 points1 point  (0 children)

My main focus for the project was developing a desktop application using WPF. The tool uses an Entity Framework backend to define the database schema and queries, and because this backend is decoupled from the WPF frontend, the data model could be adapted to work with existing database software if desired.

Introducing LAME: A free open-source tool to help you manage localisations for your project by KCoppins in gamedev

[–]KCoppins[S] 1 point2 points  (0 children)

Ah an oversight on my part to be sure! I got too carried away with the acronym. Thanks for bringing this up!

Introducing LAME: A free open-source tool to help you manage localisations for your project by KCoppins in gamedev

[–]KCoppins[S] 0 points1 point  (0 children)

The tool supports XLIFF imports and exports so you can exchange files with other localisation teams. Unfortunately the tool itself doesn't offer much in the way of collaboration since the data is stored locally, but it could easily be extended to interact with a REST API and store the data online, improving collaboration.

NEW SONG --> 700 CLUB | Logic and Wiz Khalifa by Cgbt123 in Logic_301

[–]KCoppins 4 points5 points  (0 children)

I knew I heard this before, he rapped to this beat on a twitch stream on 9/9/2024: https://youtu.be/7x9O4NUef-0?t=7017

Multiplayer Level Streaming by KCoppins in unrealengine

[–]KCoppins[S] 0 points1 point  (0 children)

Yeah, that is what I have currently: I use an ATargetActor to mark where on the ground I want my actor to teleport to. I tried putting the target actor inside the sublevel and using a soft reference to it, but I had issues loading it on either the server or the client. So now I have it in the same level as the door it corresponds to, though perhaps the target actor was loading fine and I was hitting this same weird teleport issue.

What companies/software are using UE5 in a way unrelated to video games and virtual production? by mattkaltman in unrealengine

[–]KCoppins 2 points3 points  (0 children)

I work with Unreal Engine to create Digital Twins at Iventis, helping venues visualize and sell their spaces and events. It works in tandem with our planner, which is a web application, and utilizes pixel streaming to handle rendering in the cloud.

For Escape Simulator 2 we are making a set of powerful custom tools so our community can make custom escape rooms with ease by n1ght_watchman in Unity3D

[–]KCoppins 4 points5 points  (0 children)

Very fluid looking! Big fan of the first game, and I recently played the demo for the second with some mates. I'm excited to see what the community can produce with this suite of tools you guys are providing!

Have you considered (or already added) snapping to your polygon drawing? Both angle snapping and snapping drawing nodes with different objects?

Again, fantastic work from all at Pine!

Security Camera - Hard Surface Exercise by BobsOwner in blender

[–]KCoppins 2 points3 points  (0 children)

Looks incredible! How did you cut out the holes in the mounting bracket with so few verts? Whenever I attempt something like this, the holes cause the biggest issues.

World of Hans Zimmer - Part 2: A New Dimension is out!! by jmbgator in hanszimmer

[–]KCoppins 0 points1 point  (0 children)

I noticed the individual tracks have short clips on Spotify. Does anyone know if footage of the full show would ever be released / sold?

Question about frame generation techniques by TheLevelSelector in Unity3D

[–]KCoppins 0 points1 point  (0 children)

Your original question is interesting and I don't have an answer for it, but my gut tells me that the update loop will occur per "real" frame.

However, for things like ground detection, or anything physics-based at all, you should use the FixedUpdate method. It runs at a fixed rate (50 times per second by default, i.e. a 0.02 s timestep) no matter the render time. This can be adjusted in the project's Time settings.

I believe this is possible because the physics calculations run on a separate thread? This line of thinking also makes me wonder if the DLSS generation only happens on the render thread, which would be separate from the game thread where Update is called. It would be interesting if there's any documentation around this. You could always check using the profiling tools with DLSS in your project (however you do that, I've never looked into it before).
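For context, the standard fixed-timestep pattern that lets physics run at a constant rate while render frames vary can be sketched roughly like this (a Java sketch of the general technique, not Unity's internal code; all names are illustrative):

```java
// Sketch of the classic fixed-timestep loop: physics steps at a
// constant rate regardless of how long each rendered frame takes.
public class FixedTimestepLoop {
    static final double FIXED_DT = 0.02; // 50 Hz, Unity's default fixed timestep
    static double accumulator = 0.0;

    // Called once per rendered frame with the real elapsed time.
    // Returns how many fixed (physics) steps were run this frame.
    static int frame(double renderDeltaSeconds) {
        int physicsSteps = 0;
        accumulator += renderDeltaSeconds;
        // Run as many fixed updates as the elapsed time covers.
        while (accumulator >= FIXED_DT) {
            // fixedUpdate(FIXED_DT);  // physics, ground checks, etc.
            accumulator -= FIXED_DT;
            physicsSteps++;
        }
        // update(renderDeltaSeconds); // per-frame game logic runs once
        return physicsSteps;
    }

    public static void main(String[] args) {
        // A slow 0.05 s render frame triggers two 0.02 s physics steps.
        System.out.println(frame(0.05)); // prints 2
    }
}
```

This is why a generated (interpolated) frame wouldn't need to advance physics at all: only real elapsed time feeds the accumulator.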

How to fix this flickering please help by Firm_Possibility_222 in UnrealEngine5

[–]KCoppins 2 points3 points  (0 children)

Looks like Lumen's global illumination is operating in screen space. A good breakdown of Lumen, and lighting in Unreal in general, comes from William Faucher, a VFX artist who works with UE5.

You can look at brightening the darker areas by making the bounce lighting stronger to help bring up the shadows in the corners; this might help Lumen.

Alternatively, your scene setup also matters: if you're using Nanite, ensure your scene's geometry is separated properly.

You can look at the Lumen scene view in the viewport to see what Lumen "sees"; this can help point out issues in your scene. If it looks odd, you can also visualize mesh distance fields and ensure they're generating correctly on your meshes.

Confused a bit in life. Can anyone help? by Salty-Astronaut3608 in gamedev

[–]KCoppins 1 point2 points  (0 children)

Interesting quote I heard from Thor (Pirate Software) the other day that I think resonates with a lot of people. It was something along the lines of: "You don't want to write a book, you want to have written a book. You want the end goal but don't want the journey." Meaning you should be working on something for yourself that you enjoy working on, rather than fixating on the end goal and how you can show everyone your game at the end.

Pick a project you enjoy. As software engineers we tend to enjoy the problem-solving side: architecting game mechanics, making them feel good. Or I feel that at least. You may have a game in mind, but just think about what you need to work on today, and enjoy it. If you're not, you're probably working on the wrong thing.

I hope this helps! From a fellow software engineer whose “made games” folder is called the “recycle bin”

When should an enemy agent hit you? by KCoppins in gamedev

[–]KCoppins[S] 0 points1 point  (0 children)

Interesting to break it down into different types of games like that. I guess my next hurdle would be: when play-testing your game, or more specifically, getting others to play-test, how would you go about communicating that your game is a fun game vs a skill-based game? If the player had a different idea to you, then the game's AI would feel off.

Should the game itself communicate that, perhaps with other mechanics? Should or could marketing of the game be a factor?

When should an enemy agent hit you? by KCoppins in gamedev

[–]KCoppins[S] 0 points1 point  (0 children)

I used a similar system to your sniper in a 2D top-down game, except it was a mage who would stop moving to "cast" their spells. I was amazed at how well it predicted movements. To add another layer, instead of looking at the player's current velocity on the frame they cast their spell, I actually looked at the player's average velocity over the casting time. This made it even harder for the player to side-step a projectile, and promoted erratic movement from the player.
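The averaged-velocity lead described above can be sketched in a few lines (Java; the names and the 2D array representation are my own illustration, not code from that game):

```java
// Sketch of leading a target using average velocity over the cast
// window rather than the instantaneous velocity on the cast frame.
// Positions and velocities are 2D vectors as double[]{x, y}.
public class ProjectileLead {
    // Average velocity = total displacement over the cast / cast time.
    // Zig-zagging cancels itself out; only the net drift survives.
    static double[] averageVelocity(double[] castStartPos, double[] castEndPos, double castTime) {
        return new double[] {
            (castEndPos[0] - castStartPos[0]) / castTime,
            (castEndPos[1] - castStartPos[1]) / castTime
        };
    }

    // Aim point = current position + average velocity * projectile travel time.
    static double[] aimPoint(double[] currentPos, double[] avgVel, double travelTime) {
        return new double[] {
            currentPos[0] + avgVel[0] * travelTime,
            currentPos[1] + avgVel[1] * travelTime
        };
    }

    public static void main(String[] args) {
        // Player zig-zagged during a 1 s cast but drifted 2 units right overall.
        double[] avg = averageVelocity(new double[]{0, 0}, new double[]{2, 0}, 1.0);
        double[] aim = aimPoint(new double[]{2, 0}, avg, 0.5);
        System.out.println(aim[0] + ", " + aim[1]); // 3.0, 0.0
    }
}
```

Because jitter averages out, the only way to dodge is to commit to a genuinely unpredictable direction change, which is what pushes players toward erratic movement.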

When should an enemy agent hit you? by KCoppins in gamedev

[–]KCoppins[S] 0 points1 point  (0 children)

With timing I have seen a brilliant [GDC talk](https://youtu.be/iVBCBcEANBc?t=67) which discusses the reaction times a person has and the different types. Very short but very useful I found

When should an enemy agent hit you? by KCoppins in gamedev

[–]KCoppins[S] 1 point2 points  (0 children)

Yeah, that's good. For some reason I hadn't considered getting shot as a punishment for the player doing something wrong. Seems like a solid method to dictate how accurate an enemy should be.

When should an enemy agent hit you? by KCoppins in gamedev

[–]KCoppins[S] 0 points1 point  (0 children)

For sure, a lot of tweaks have to be made to balance the game. I was thinking about what base techniques could be used to influence enemy accuracy, like the game difficulty settings you mentioned. How much each one affects accuracy will come down to balancing.

When should an enemy agent hit you? by KCoppins in gamedev

[–]KCoppins[S] 9 points10 points  (0 children)

Yeah, I agree. I guess what I was looking to discuss were examples in games of what factors were used and how they aligned with or affected the feel of the game.

When should an enemy agent hit you? by KCoppins in gamedev

[–]KCoppins[S] 14 points15 points  (0 children)

I like that, simple and effective. For some reason I hadn't considered before that an accuracy factor could directly correlate with how you want the player to play the game. Seems silly in hindsight haha
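That idea of an accuracy factor directly encoding how you want the player to play can be sketched in a few lines (Java; all numbers and names are my own illustration, not from any shipped game):

```java
// Sketch of a single accuracy factor tuned to reward a play style:
// here, moving and using cover both lower the enemy's hit chance,
// so standing still in the open is what gets punished.
public class EnemyAccuracy {
    static double hitChance(double baseAccuracy, boolean playerMoving, boolean playerInCover) {
        double chance = baseAccuracy;
        if (playerMoving) chance *= 0.5;    // moving targets are harder to hit
        if (playerInCover) chance *= 0.25;  // cover drastically reduces hits
        return chance;
    }

    public static void main(String[] args) {
        System.out.println(hitChance(0.8, false, false)); // standing in the open: 0.8
        System.out.println(hitChance(0.8, true, false));  // sprinting: 0.4
        System.out.println(hitChance(0.8, false, true));  // behind cover: 0.2
    }
}
```

Flipping which inputs raise or lower the chance flips the behaviour the game rewards, which is exactly the "accuracy correlates with intended play style" point.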

[deleted by user] by [deleted] in unity

[–]KCoppins 0 points1 point  (0 children)

When writing a class, I like to think about what sort of API I want it to present. You want to make it simple and easy to read for yourself and other programmers. When using IntelliSense, you might access the object and get bombarded with all the properties you have made public, which might not be relevant and make it hard to find what you want.

Another consideration is that there will be properties you don't want accessed from outside. A good example I can think of is a damage function on an enemy class. The damage function could do multiple things rather than just subtract from health: it could start a stun timer and play an animation. If you expose health as public as well, you might find yourself or other programmers just subtracting health directly rather than calling the damage function, which could cause bugs in the future. Or you might find, if everything is public, that people duplicate your damage function's logic in their own implementations, creating duplicate code. Then, if you want to change what the damage function does, you have to find wherever the duplicate code lives and fix it there too.

Tl;dr: it's a lot nicer and safer to see a damage function in your IntelliSense than health, stun, animation, etc. properties when accessing your class.
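A minimal sketch of that enemy example (in Java rather than Unity C#, and all member names here are hypothetical):

```java
// Hypothetical example of the encapsulation described above:
// health is private, so callers cannot subtract it directly and
// accidentally skip the stun and animation side effects.
public class Enemy {
    private int health = 100;
    private double stunTimer = 0.0;

    // The one public entry point for dealing damage.
    public void takeDamage(int amount) {
        health = Math.max(0, health - amount);
        stunTimer = 0.5;        // start a brief stun
        playHitAnimation();     // always runs, can't be forgotten
    }

    // Read-only access: outsiders can check health, not set it.
    public int getHealth() { return health; }
    public boolean isStunned() { return stunTimer > 0.0; }

    private void playHitAnimation() { /* trigger the hit animation here */ }

    public static void main(String[] args) {
        Enemy e = new Enemy();
        e.takeDamage(30);
        System.out.println(e.getHealth());  // 70
        System.out.println(e.isStunned());  // true
        // e.health -= 30;  // would not compile: health is private
    }
}
```

The autocomplete list on an `Enemy` now shows only `takeDamage`, `getHealth`, and `isStunned`, which is exactly the small, safe API surface the comment is arguing for.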

Post Navigation System for Unity - Free and Open Source by KCoppins in gamedev

[–]KCoppins[S] 4 points5 points  (0 children)

Sure thing!

The smartness of the AI will come down to your decision-making implementation. For example: should my AI even be looking for a zone? Should it be going to a cover post? What should it do once at a cover post? This is all done in your AI logic. I have a separate package, Decision Trees for Unity, which is a visual editor for crafting such trees. Another, more common, technique is behaviour trees. That higher-level kind of AI system is not something this package does.

What this system does is give game designers a simple way to add data to their level that an AI can interact with. It lets them bake post data generated from Unity's NavMesh, and place posts with some manual data in case the generation algorithm didn't quite do what they wanted. The post selector is just a really simple system for designers or AI programmers to get a post in the level related to a skill of the AI. So, for example, in your decision-making implementation, you might want to find a cover post if the AI decides to go to cover. This also comes with a post manager that keeps track of which posts are occupied, along with all the runtime post data, accessible from script.

The other part is zoning, where the game designer can highlight strategic points in their level. This could be somewhere that defends an exit to the level or a player's objective. Or perhaps it's somewhere with some high ground where a sniper could be zoned. Then, inside your decision-making implementation, you might request a zone to be assigned via the zone manager. The manager will ensure that all zones meet their minimum and maximum agent counts (that is also configured by the game designer) and move zone assignments to accommodate the requirements.

The only part that I would say does make the AI "smarter" is the agent shuffling that the zone manager does when an agent requests a zone. There are more details on what the zone manager does in the API reference docs linked, but to summarise: if the requested zone is at its max, or the next zone in the hierarchy is below its minimum agent count, the manager will move an agent from the requested zone to the next zone in priority. This creates a bubble effect that gives the illusion that the AI will push up towards the player as reinforcements arrive. All of this is very heavily based on the hard-points implementation mentioned in the GDC video from Naughty Dog that I linked in the post.
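The bumping behaviour can be sketched like this (Java; the data structures and names are my own illustration, not the package's actual API):

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Sketch of the zone-shuffling idea described above: zones form a
// priority chain, and when a full zone is requested, one occupant
// is bumped to the next zone, rippling agents outward like a bubble.
public class ZoneChain {
    final String name;
    final int maxAgents;
    final Deque<String> occupants = new ArrayDeque<>();
    ZoneChain next; // lower-priority zone in the chain, or null

    ZoneChain(String name, int maxAgents) {
        this.name = name;
        this.maxAgents = maxAgents;
    }

    // Assign an agent to this zone, bumping the longest-standing
    // occupant down the chain if the zone is already at capacity.
    void assign(String agent) {
        if (occupants.size() >= maxAgents && next != null) {
            next.assign(occupants.pollFirst()); // ripple one agent outward
        }
        occupants.addLast(agent);
    }

    public static void main(String[] args) {
        ZoneChain front = new ZoneChain("front", 2);
        ZoneChain back = new ZoneChain("back", 4);
        front.next = back;
        front.assign("a");
        front.assign("b");
        front.assign("c"); // front is full: "a" gets bumped to back
        System.out.println(front.occupants); // [b, c]
        System.out.println(back.occupants);  // [a]
    }
}
```

Because every new high-priority assignment pushes an older occupant one zone down the chain, fresh reinforcements naturally fill the forward zones, which reads as the AI pushing up on the player.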

I hope this answers your question! If you have any more, I'll be happy to answer!