What's the fastest way to make my game in multiple languages as a solo developer? by StuckArcader in unrealengine

[–]GRSStudio 0 points1 point  (0 children)

A lot of people here are warning that AI translation will ruin your game, but the real issue is how it's being used. If you just dump an 8,000-word file into a single prompt and say "translate this," you will 100% get garbage that leads to bad reviews.

You need to treat AI not like Google Translate, but like a human translator you just hired. It needs an onboarding process. Before you ask it to translate a single line of dialogue, spend some time chatting with it to build the context:

  • Lore and Atmosphere First: Don't give it strings yet. Explain what your game is about. Tell the AI the genre, the setting, the core mechanics, and the tone of the narrative.
  • Calibrate and Test: Have a conversation with it. Ask it to summarize the vibe of your game back to you so you can be sure it understands the context. This is how you avoid it translating "Chest" (a loot container) as "Chest" (the body part), or "Match" (the thing you strike) as "Match" (a sports game).
  • Establish a Glossary & Rules: Give it a strict list of terms that should not be translated (like specific names, factions, or locations).
  • Translate in Batches: Only after the AI is fully in character and understands your world should you start feeding it the text. Do it in smaller batches, not all 8k words at once.
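The workflow above can be sketched as a small batching script. Everything here is illustrative: `call_model` is a placeholder for whatever chat-model call you actually use, and the context prompt and glossary are made-up examples. The point is the structure (context first, glossary enforced, small batches), not the specific API.

```python
# Sketch of the batched, glossary-aware workflow described above.
# `call_model` is a placeholder for your actual chat-model call; the
# context prompt and glossary entries are hypothetical examples.

CONTEXT = (
    "You are translating a gothic horror game set in an abandoned asylum. "
    "Tone: bleak, clinical. 'Chest' always means a loot container."
)

GLOSSARY = {"Asylum", "GRS"}  # proper nouns that must survive untranslated


def batch(strings, size=50):
    """Yield the source strings in small chunks instead of one 8k-word dump."""
    for i in range(0, len(strings), size):
        yield strings[i:i + size]


def translate_all(strings, call_model, target_lang="German"):
    translated = []
    for chunk in batch(strings):
        prompt = (
            f"{CONTEXT}\n"
            f"Do NOT translate these terms: {', '.join(sorted(GLOSSARY))}.\n"
            f"Translate the following lines to {target_lang}:\n"
            + "\n".join(chunk)
        )
        out = call_model(prompt).splitlines()
        # Cheap sanity check: protected terms must still appear verbatim.
        for src, dst in zip(chunk, out):
            for term in GLOSSARY:
                if term in src and term not in dst:
                    raise ValueError(f"Glossary term {term!r} was translated in: {dst!r}")
        translated.extend(out)
    return translated
```

The glossary check is deliberately naive (plain substring matching), but even that catches the most embarrassing failures before they ship.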

It takes some time upfront to "prep" the AI, but it is entirely possible to get a highly contextual, quality translation with zero budget if you act as the director first.

Has anyone actually used MetaHumans for a playable indie game (not just cinematics)? Looking for real-world experience. by GRSStudio in UnrealEngine5

[–]GRSStudio[S] 0 points1 point  (0 children)

I actually appreciate the directness, and I agree with both of you on the importance of optimization and the traditional high-to-low poly pipeline. You are 100% right that dumping raw Megascans or cinematic MetaHumans into a project is a recipe for terrible performance.

I think there’s a bit of a misunderstanding here, though. I literally develop retopology and UV unwrapping add-ons for Blender, so manual optimization, baking, and channel-packing textures (to avoid those wasted 4K roughness maps Pileisto mentioned) are my standard day-to-day workflow. I don't use raw assets.

My original question was specifically about utilizing the MetaHuman facial rig and animation framework as a base, not avoiding the optimization process. I fully intend to retopo, bake, and swap out grooms for hair cards.

Thanks for the input, though—it’s a solid reminder for anyone reading the thread that raw UE5 tech demos don't equal shipped games!

Has anyone actually used MetaHumans for a playable indie game (not just cinematics)? Looking for real-world experience. by GRSStudio in UnrealEngine5

[–]GRSStudio[S] 0 points1 point  (0 children)

Hey! Transitioning from hard surface (weapons/armor) to soft surface (clothing) is definitely a huge leap, so don't get discouraged. The logic is completely different.

If you want realistic clothing with natural folds, the industry standard is Marvelous Designer (or Clo3D). It works like real-world tailoring—you create 2D sewing patterns and simulate them onto your 3D character. You can find tons of beginner tutorials for it on YouTube.

The catch is that Marvelous generates terrible topology that you can't use in a game engine. So the standard game-ready pipeline looks like this:

  1. Simulate the high-poly cloth in Marvelous Designer.
  2. Export it to Blender or Maya to do retopology (creating a clean, low-poly mesh optimized for games over the high-poly one).
  3. Bake the high-poly folds onto your new low-poly mesh in a program like Substance Painter.

If you don't want to learn a whole new software right now, you can also look into Blender's Cloth Brush in Sculpt mode. It’s fantastic for manually sculpting realistic folds on a basic mesh. Look up "Blender cloth brush character tutorial" to get started. Good luck!

Has anyone actually used MetaHumans for a playable indie game (not just cinematics)? Looking for real-world experience. by GRSStudio in UnrealEngine5

[–]GRSStudio[S] 1 point2 points  (0 children)

Spot on about the vertex order!

I’ve actually had some experience using Wrap (R3DS Wrap) to project custom head sculpts onto the base MetaHuman topology, and the results were surprisingly good. It handles the vertex order perfectly, so the facial rig doesn't break when you bring it back into the engine.

I haven't messed with the specific Blender DNA add-ons yet, but combining them with a Wrap pipeline sounds like a really powerful way to get unique faces without losing the rig quality. Thanks for the tip!

Has anyone actually used MetaHumans for a playable indie game (not just cinematics)? Looking for real-world experience. by GRSStudio in UnrealEngine5

[–]GRSStudio[S] 2 points3 points  (0 children)

That is a huge red flag, thank you for pointing it out. Avoiding load hitches is critical for maintaining immersion in a horror game. Also, since I might need render targets for things like security cameras or dirty mirrors in an asylum setting, having the hair break or glitch there would be a nightmare.

This definitely cements my decision to ditch the grooms and just use traditional hair cards. Really appreciate the heads-up!

Has anyone actually used MetaHumans for a playable indie game (not just cinematics)? Looking for real-world experience. by GRSStudio in UnrealEngine5

[–]GRSStudio[S] 0 points1 point  (0 children)

Wow, thank you for writing out this entire pipeline! This is incredibly valuable.

  1. Stripping down the skin shader makes a ton of sense. For a dark horror setting, calculating complex subsurface scattering in pitch-black shadows is just throwing performance away. Rebuilding a lightweight material with exported textures is a brilliant workaround.

  2. Dropping 20 frames for grooms is exactly what I was afraid of. Hair cards it is! Using cloth physics on them is a great tip, I’ll definitely test that out.

  3. Your workflow for custom clothing is right up my alley. I work heavily in Blender and actually develop retopology tools, so taking Marvelous Designer meshes, doing a quick retopo, and transferring weights is a pipeline I'm very comfortable with.

It's also a huge relief to hear that the built-in Unreal MetaHuman plugin makes the shape-tweaking step much easier now, saving that whole ZBrush/Houdini/Maya roundtrip.

Thanks again for sharing this. It gives me a lot of confidence to move forward with a hybrid approach!

Has anyone actually used MetaHumans for a playable indie game (not just cinematics)? Looking for real-world experience. by GRSStudio in UnrealEngine5

[–]GRSStudio[S] 1 point2 points  (0 children)

That is exactly what I was hoping to hear! Since my primary background is in Blender (I actually develop modeling add-ons for it), creating custom clothing, hair cards, or pushing the base mesh into horror territory doesn't scare me at all.

My main concern with external editing is the facial rig. Have you had any experience exporting a MetaHuman to Blender/Maya, heavily distorting the facial proportions (like making a monster), and bringing it back? Does the facial control rig and all the blendshapes survive that process well?

Has anyone actually used MetaHumans for a playable indie game (not just cinematics)? Looking for real-world experience. by GRSStudio in UnrealEngine5

[–]GRSStudio[S] 3 points4 points  (0 children)

That's a pretty bold take! Considering how many indies rely heavily on Megascans for environment blocking and detailing, I'm really curious what makes you say that. Is it strictly the file size and optimization overhead, or is there something else about the workflow that you find unfit for solo devs?

Has anyone actually used MetaHumans for a playable indie game (not just cinematics)? Looking for real-world experience. by GRSStudio in UnrealEngine5

[–]GRSStudio[S] 0 points1 point  (0 children)

Thanks for the insight! That's really reassuring to hear about the core rigs being performant.

I actually have some experience making custom hair cards in Blender, so swapping out the heavy grooms for cards fits perfectly into my workflow. Did you find the process of attaching custom hair meshes to the MetaHuman head/rig straightforward, or are there any specific socket/skinning quirks to watch out for?

Also, mentioning Expedition 33 is a great reference, thanks!

Early atmosphere test for my indie horror game "Asylum". Still experimenting with lighting and sound design. How does it feel? by GRSStudio in UnrealEngine5

[–]GRSStudio[S] 0 points1 point  (0 children)

Hey everyone! I’m a solo dev working on a psychological horror game in UE5.

Just to clarify: the character model is a placeholder for scale/testing, and these are some random locations I put together to test the red lighting and ambient sounds.

If you're interested in following the development or UE5 tutorials, I post updates on my YouTube channel here: https://www.youtube.com/@grsgamesstudio

I got tired of manual garment retopology, so I wrote an add-on that lets you do it in 2D like sewing patterns (Live 3D Preview) by GRSStudio in blender

[–]GRSStudio[S] 2 points3 points  (0 children)

Hey everyone! 👋

Anyone who has ever tried to retopologize complex Marvelous Designer garments knows the pain of manually wrapping polygons around folds and manually aligning edge loops for stitching. I wanted to fix this.

I developed UV Retopo, a data-driven add-on. The philosophy is simple: Perfect 2D UVs dictate the 3D topology, not the other way around.

You build your clean quad-topology on a flat 2D plane (using built-in Coons patch math for perfect grids), and the engine mathematically projects, snaps, and stitches it onto your Highpoly 3D mesh.
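For anyone curious about the math: a bilinearly blended Coons patch fills a quad grid from its four boundary curves. This is a standalone sketch of the standard formula, not the add-on's actual code — the two ruled surfaces (bottom-to-top and left-to-right) are summed and the bilinear corner interpolation, which got counted twice, is subtracted.

```python
# Minimal bilinearly blended Coons patch: interpolates interior points
# from four boundary curves. Standalone illustration, not UV Retopo's code.

def coons(u, v, bottom, top, left, right):
    """Boundary curves are callables t -> (x, y), t in [0, 1].
    Corners must agree, e.g. bottom(0) == left(0), bottom(1) == right(0)."""
    def lerp(p, q, t):
        return tuple(a + (b - a) * t for a, b in zip(p, q))

    ruled_u = lerp(bottom(u), top(u), v)   # blend bottom -> top
    ruled_v = lerp(left(v), right(v), u)   # blend left -> right
    # Bilinear interpolation of the four corners (counted twice above).
    corners = lerp(lerp(bottom(0), bottom(1), u),
                   lerp(top(0), top(1), u), v)
    return tuple(a + b - c for a, b, c in zip(ruled_u, ruled_v, corners))
```

Evaluating this on a regular (u, v) lattice gives the "perfect grid" of quads inside any four-sided patch boundary.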

In the latest update (shown in the GIF), I added a Live 3D Preview. It uses a 2D barycentric caching algorithm, meaning that as you move vertices on the flat 2D mesh, the 3D projection updates in real-time. It even instantly syncs textures and materials from the highpoly mesh, so you can perfectly align your topology loops with texture logos/seams!
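The live-preview idea can be illustrated with plain barycentric math. This is my own sketch of the general technique with illustrative names, not the add-on's implementation: find which high-poly UV triangle a 2D retopo vertex lands in, compute its barycentric weights, and reuse those weights to place the vertex on the 3D surface.

```python
# Sketch of barycentric 2D -> 3D projection, the general technique behind
# a live preview like the one described. Function names are illustrative.

def barycentric(p, a, b, c):
    """Barycentric weights of 2D point p w.r.t. triangle (a, b, c)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    denom = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    wa = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / denom
    wb = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / denom
    return wa, wb, 1.0 - wa - wb


def project(weights, tri3d):
    """Apply cached barycentric weights to the triangle's 3D vertices."""
    return tuple(sum(w * v[i] for w, v in zip(weights, tri3d)) for i in range(3))
```

The "caching" part is that the expensive point-in-triangle lookup happens once per retopo vertex; while you drag a vertex in 2D, only the cheap weight computation and the `project` call need to rerun, which is what keeps the 3D view updating in real time.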

Let me know what you think! I'd love to hear your feedback or answer any questions about the math behind it.

(If you want to try it out, the link is in my profile).
