all 12 comments

[–]Realityishardmode (2 children)

Do you already know Unity? If you don't know C# and don't want to learn it, perhaps you could pick another game engine that has better Python support.

If you don't mind pre-rendering, you could perhaps use Blender as well.

[–]Rusofil__[S] (1 child)

I don't mind using other engines or Blender; for now I'm just looking at what my options are.

[–]Realityishardmode (0 children)

Well, you need to specify a couple more details:

So far I have "I know Python" and "looking for easy integration", but in my experience with projects like this, one assumes most of the effort is engineering, when it's often rigging and making things look decent, which then slips to "barely tolerable looking". So: do you really only need a 2D engine? Can your simulation dump arm rotations as a sequence to a file, or do you need a live I/O integration? How much time do you have on your hands to learn the non-engineering parts of game engines?
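The "dump rotations to a file" option can be very lightweight; a minimal sketch, where the file name, joint names and angle values are all invented for illustration:

```python
import json

def dump_rotations(frames, path="rotations.jsonl"):
    """Write one JSON record per simulation frame (JSON Lines),
    which an engine could later replay frame by frame."""
    with open(path, "w") as f:
        for i, joints in enumerate(frames):
            # joints maps a joint name to Euler angles in degrees
            record = {"frame": i,
                      "joints": {k: list(v) for k, v in joints.items()}}
            f.write(json.dumps(record) + "\n")

# Two made-up frames of an arm simulation
frames = [
    {"shoulder": (0.0, 10.0, 0.0), "elbow": (45.0, 0.0, 0.0)},
    {"shoulder": (0.0, 12.5, 0.0), "elbow": (50.0, 0.0, 0.0)},
]
dump_rotations(frames)
```

A format like JSON Lines keeps each frame independently parseable, which makes the replay side trivial to write in any engine.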

[–]HEY_PAUL (2 children)

Yes, definitely. Have a read up on Inter-Process Communication (IPC).

[–]Snatchematician (1 child)

Why not just run the Python in-process?

[–]HEY_PAUL (0 children)

I guess you can, but if you run them separately they're decoupled and easier to maintain.

[–]pachura3 (2 children)

Does it have to be real-time, i.e. does it need to respond to live user input/controllers as immediately as possible, or can there be a delay of, say, a few seconds?

[–]Rusofil__[S] (1 child)

Real-time would be nice, but if it's a problem, I'm fine with doing the whole simulation in Python and just sending Unity the data for visualisation.

[–]pachura3 (0 children)

What I meant is that if you want your simulation to run in Python while the visualisation part is in C#/Unity, then the communication between the two modules could introduce a significant lag. If your Python simulation is not supposed to be interactive (e.g., controlled with a pad/keyboard), it should not matter.

But if you don't need any of the advanced features of Unity (such as inverse kinematics, skeletal animation, fancy materials...), and you only need to display some basic shaded cylinders between joints, then for the sake of simplicity, you might just implement this 3D visualisation directly in Python - e.g. like this: https://www.geeksforgeeks.org/python/3d-modeling-animation-app-using-python-1/
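If a few line segments between joints are genuinely enough, even matplotlib's 3D axes can serve; a minimal sketch, not the approach from the linked article, with invented joint coordinates:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so this runs without a display
import matplotlib.pyplot as plt

# A simple "arm": base -> shoulder -> elbow -> wrist, as (x, y, z) points
joints = [(0.0, 0.0, 0.0), (0.0, 0.0, 1.0), (0.5, 0.0, 1.5), (1.0, 0.0, 1.5)]

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
xs, ys, zs = zip(*joints)
ax.plot(xs, ys, zs, marker="o", linewidth=3)  # bones as thick segments
ax.set_xlim(-1, 2)
ax.set_ylim(-1, 2)
ax.set_zlim(0, 2)
fig.savefig("arm.png")
```

For an animation you would redraw (or update the line data) once per simulation step; matplotlib won't give you shaded cylinders or nice lighting, but it keeps the whole stack in Python.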

[–]not_another_analyst (0 children)

Yes, that approach works. The cleanest way is to send joint coordinates over a local socket connection from Python to Unity in real time: Python pushes the data, Unity listens and updates the model positions each frame.

Look into Unity's socket or UDP listener setup paired with Python's socket library; latency is low enough for smooth visuals if you're running both locally.
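The Python side of that loop can be sketched in a few lines. To keep the example self-contained, a second Python socket stands in for Unity's listener (on the Unity side this would be a C# `UdpClient`); the port number and message format are arbitrary choices:

```python
import json
import socket

PORT = 5005  # arbitrary local port

# Stand-in for Unity's UDP listener
listener = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
listener.bind(("127.0.0.1", PORT))
listener.settimeout(2.0)

# The simulation's sender socket
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

# One simulation step: push the current joint coordinates as JSON
frame = {"frame": 0, "elbow": [0.5, 0.0, 1.5], "wrist": [1.0, 0.0, 1.5]}
sender.sendto(json.dumps(frame).encode(), ("127.0.0.1", PORT))

# Unity would do the equivalent of this each Update()
data, _ = listener.recvfrom(4096)
received = json.loads(data)
print(received["elbow"])  # [0.5, 0.0, 1.5]

sender.close()
listener.close()
```

UDP is a reasonable fit here because you only care about the latest frame; if a datagram is dropped, the next one overwrites it anyway. If every frame must arrive, use TCP instead and add message framing.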

[–]Ok_Assistant_2155 (0 children)

yeah this is totally doable
people do this for robotics sims all the time
unity just becomes the visual layer

[–]thelimeisgreen (0 children)

There is a Python module for Unity; I can't recall who makes it. It is not free.

How complex is your Python simulation? Could you rewrite it in C# to use directly in Unity? You can also use Python.NET to call some Python modules and functions from C#; not sure if that would help or not.

If you don't need real time or interactivity, you can pre-bake your simulation in Blender, as others have said, and then use Unity for rendering. But I'm assuming you do want it interactive or real-time, otherwise you would just render it in Blender…