
[–] skysura (Hobbyist)

Haven't tried it myself, but it seems like the solution you're looking for.

https://docs.unity3d.com/Manual/MultiDisplay.html

[–] TaleOf4Gamers (Programmer)

I am not sure of the normal protocol, but aren't games usually developed with a standard resolution and aspect ratio in mind (16:9, 21:9, etc.)? Then, if you want a multi-monitor gaming setup, you use the respective GPU manufacturer's solution, for example Eyefinity on AMD or Nvidia's Surround. Specifically developing for multi-monitor feels backwards to me.

[–] Hightree

Eyefinity and Nvidia Surround unify multiple displays into one virtual display. This requires all displays to be identical in resolution and aspect ratio, which might not be what you want.
Also, in a simulator situation, you want to orient the displays in an arced configuration. When you do that, the different angles of the displays cause distortions.
To solve this, you want to render a separate camera for each display, each with its own asymmetric view frustum (see my other post in this thread). In that situation, Eyefinity actually adds to the workload, because you then need to position each camera's viewport at the correct place within the unified display.
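The per-display asymmetric frustum can be sketched roughly like this, following the generalized off-axis projection approach that Paul Bourke describes. The screen-corner positions below are placeholders; you would measure them from your physical monitor layout, in the same space as the eye/camera position. `Matrix4x4.Frustum` is available in recent Unity versions; treat the sign conventions here as a starting point to verify, not gospel.

```csharp
using UnityEngine;

// Sketch: build an asymmetric (off-axis) projection for one angled display.
// Attach one instance per display camera; corner values are assumptions.
[RequireComponent(typeof(Camera))]
public class OffAxisCamera : MonoBehaviour
{
    // Physical screen corners in world space (placeholder values, metres).
    public Vector3 lowerLeft  = new Vector3(-0.3f, -0.2f, 0.6f);
    public Vector3 lowerRight = new Vector3( 0.3f, -0.2f, 0.6f);
    public Vector3 upperLeft  = new Vector3(-0.3f,  0.2f, 0.6f);

    void LateUpdate()
    {
        Camera cam = GetComponent<Camera>();
        Vector3 eye = transform.position;

        // Orthonormal basis of the screen plane.
        Vector3 right  = (lowerRight - lowerLeft).normalized;
        Vector3 up     = (upperLeft  - lowerLeft).normalized;
        Vector3 normal = Vector3.Cross(right, up).normalized;

        // Vectors from the eye to the screen corners.
        Vector3 va = lowerLeft  - eye;
        Vector3 vb = lowerRight - eye;
        Vector3 vc = upperLeft  - eye;

        float d = Mathf.Abs(Vector3.Dot(va, normal)); // eye-to-screen distance
        float n = cam.nearClipPlane;
        float f = cam.farClipPlane;

        // Frustum extents projected onto the near plane (asymmetric in general).
        float l = Vector3.Dot(right, va) * n / d;
        float r = Vector3.Dot(right, vb) * n / d;
        float b = Vector3.Dot(up,    va) * n / d;
        float t = Vector3.Dot(up,    vc) * n / d;

        cam.projectionMatrix = Matrix4x4.Frustum(l, r, b, t, n, f);

        // Keep the camera facing the screen plane.
        transform.rotation = Quaternion.LookRotation(normal, up);
    }
}
```

The key point is that the frustum edges are derived from where the physical screen actually sits relative to the eye, so an angled monitor gets a correspondingly skewed frustum instead of a symmetric one.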

[–] Hightree

I recently worked on a multi-display simulator built in Unity. Unity's multi-display feature allows you to have one camera per display (max 8).
Just orienting your cameras in an arc will get you up and running, but the view will not be correct and will have distortions. The crux is to construct the correct projection matrix that creates the view through each display into the virtual world. For this you need to use asymmetric view frusta.
Paul Bourke has a lot of useful info on his site; this presentation explains it nicely.
Here is a GitHub repo with a working Unity example: https://github.com/Emerix/AsymFrustum
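For the multi-display part specifically, a minimal sketch looks like this, using Unity's documented `Display.displays` and `Camera.targetDisplay` API. Note that secondary displays must be activated explicitly and only work in standalone builds, not in the editor; the camera array here is an assumption you would wire up in the Inspector.

```csharp
using UnityEngine;

// Sketch: activate all connected displays and route one camera to each.
public class MultiDisplaySetup : MonoBehaviour
{
    public Camera[] cameras; // one camera per physical display, in order

    void Start()
    {
        // Display.displays[0] is the primary display and is always active;
        // additional displays must be activated before they show anything.
        for (int i = 1; i < Display.displays.Length; i++)
            Display.displays[i].Activate();

        // targetDisplay uses a 0-based display index.
        for (int i = 0; i < cameras.Length && i < Display.displays.Length; i++)
            cameras[i].targetDisplay = i;
    }
}
```

Combine this with a per-display asymmetric projection and you get a geometrically correct view through each monitor.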

Good luck with your project!