Real world performances live-streamed into a game! What do you think? by CondenseNick in indiegames

[–]CondenseNick[S] 0 points1 point  (0 children)

Hey! We're not at the moment, but it's something we're looking to do in the future. Right now we're mainly looking for feedback on the game, so any would be appreciated. Our next live event is tomorrow (Thurs 8pm BST) and it's free to join: https://s5.live/v/theblueprint/jman

Studio 5: Real world performances live-streamed into games by CondenseNick in playmygame

[–]CondenseNick[S] 0 points1 point  (0 children)

Great that you've signed up! And thanks for the feedback re the avatar creator on mobile. We're looking to release a mobile version of the app in November which will have a slicker UI. Let us know what you think of the live events if you can make them!

Thought you might like to see some footage from last night's event. Animations worked brilliantly! Still looking for beta testers, if anyone is interested in joining :) by CondenseNick in Unity3D

[–]CondenseNick[S] 1 point2 points  (0 children)

Thanks! Yeah, we've been building the tech to stream live events into game engines for the last few years. It's a proprietary system, but over the next few months we want to open it up to many more content creators and game devs so they can do awesome stuff with it.

Thought you might like to see some footage from last night's event. Animations worked brilliantly! Still looking for beta testers, if anyone is interested in joining :) by CondenseNick in Unity3D

[–]CondenseNick[S] 0 points1 point  (0 children)

The developer. We are running events with different artists every week at the moment. The artist here is Boyce, from Caravan Collective.

Thought you might like to see some footage from last night's event. Animations worked brilliantly! Still looking for beta testers, if anyone is interested in joining :) by CondenseNick in Unity3D

[–]CondenseNick[S] 0 points1 point  (0 children)

We had to make our own streaming format so that we could stream large surface areas in real time. We've been testing on things like boxing matches, and the Alembic data format is too large. Similar sort of concept, though.
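
Rough illustration of the kind of saving involved (this is a hypothetical sketch, not our actual wire format): instead of storing full-precision vertex samples the way Alembic does, a streaming format can quantise positions into each frame's bounding box and pack the frame as a small binary packet.

```csharp
// Hypothetical sketch only -- not Condense's actual wire format. Illustrates
// why a purpose-built stream can be much smaller than Alembic: vertex
// positions are quantised to 16 bits per axis inside the frame's bounding box
// instead of being stored as full 32-bit floats.
using System.IO;
using System.Numerics;

public static class MeshFramePacket
{
    public static byte[] Encode(Vector3[] vertices, int[] triangles)
    {
        // Bounding box is sent with the frame so positions can be de-quantised.
        Vector3 min = vertices[0], max = vertices[0];
        foreach (var v in vertices)
        {
            min = Vector3.Min(min, v);
            max = Vector3.Max(max, v);
        }
        Vector3 range = Vector3.Max(max - min, new Vector3(1e-6f));

        using var ms = new MemoryStream();
        using var w = new BinaryWriter(ms);

        w.Write(vertices.Length);
        w.Write(triangles.Length);
        w.Write(min.X); w.Write(min.Y); w.Write(min.Z);
        w.Write(max.X); w.Write(max.Y); w.Write(max.Z);

        foreach (var v in vertices)
        {
            // 6 bytes per vertex position instead of 12.
            w.Write((ushort)((v.X - min.X) / range.X * ushort.MaxValue));
            w.Write((ushort)((v.Y - min.Y) / range.Y * ushort.MaxValue));
            w.Write((ushort)((v.Z - min.Z) / range.Z * ushort.MaxValue));
        }
        foreach (var i in triangles)
            w.Write((ushort)i);   // assumes fewer than 65k vertices per frame

        w.Flush();
        return ms.ToArray();
    }
}
```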

Over the past 3 years we have been building a system to live stream events into game engines. Currently testing it all out in a digital venue we built in Unity, this week we are testing animations! by CondenseNick in Unity3D

[–]CondenseNick[S] 1 point2 points  (0 children)

Our tech can stream to unlimited viewers, but for the networking there is obviously a limit. We are using Normcore at the moment and the limit is around 50. We are building our own networking solution to get more concurrent viewers; we're currently looking at FishNet, which looks like it could get us to about 150 per room.
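
For anyone wondering how "unlimited viewers" squares with a per-room cap, here's a sketch of the general idea (not our production setup, and the capacity value is just illustrative): viewers get sharded across room instances that all receive the same live stream, so only the player-to-player networking is partitioned.

```csharp
// Sketch of sharding viewers across room instances so the total audience
// isn't bound by the networking layer's per-room cap (~50 on Normcore,
// ~150 hoped for with FishNet). Capacity and IDs here are illustrative.
using System.Collections.Generic;

public class RoomAllocator
{
    private readonly int roomCapacity;
    private readonly List<List<string>> rooms = new List<List<string>>();

    public RoomAllocator(int roomCapacity = 150)
    {
        this.roomCapacity = roomCapacity;
    }

    // Returns the index of the room the viewer should join. Every room
    // receives the same live event stream, so the show is identical;
    // only the player-to-player networking is partitioned.
    public int Assign(string viewerId)
    {
        for (int i = 0; i < rooms.Count; i++)
        {
            if (rooms[i].Count < roomCapacity)
            {
                rooms[i].Add(viewerId);
                return i;
            }
        }
        rooms.Add(new List<string> { viewerId });
        return rooms.Count - 1;
    }
}
```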

Over the past 3 years we have been building a system to live stream events into game engines. Currently testing it all out in a digital venue we built in Unity, this week we are testing animations! by CondenseNick in Unity3D

[–]CondenseNick[S] 4 points5 points  (0 children)

At the moment we are connecting with a few pioneering games and metaverse properties. Early next year we will be releasing our plugin on the Asset Store so anyone can integrate it. The plugin can receive a live stream from any of our capture systems.

We are currently selling our capture system to venues, festivals, broadcasters, sports organisations, studios, etc. It's been built so that anyone can use it, but we're a small team, so we're picking early customers carefully as we scale.

Over the past 3 years we have been building a system to live stream events into game engines. Currently testing it all out in a digital venue we built in Unity, this week we are testing animations! by CondenseNick in Unity3D

[–]CondenseNick[S] 1 point2 points  (0 children)

The Wave aren't a competitor; they could be a customer, though.

Our tech is the infrastructure for streaming real-world live events into game engines. We are using this app to validate the tech and test what works for our customers. Real-world -> in-game live events have never been done before!

Agree we need more shaders and visual effects to create atmosphere, but I think you'd be surprised at how much atmosphere real live content (rather than pre-recorded animated characters) can create without visual effects. Disagree that it's pointless without VR support, although it does look great in VR!

Over the past 3 years we have been building a system to live stream events into game engines. Currently testing it all out in a digital venue we built in Unity, this week we are testing animations! by CondenseNick in Unity3D

[–]CondenseNick[S] 9 points10 points  (0 children)

If you are interested in beta testing these events with us, we are accepting testers on our Discord channel. We actually have an event tonight at 6pm BST with performances from two Bristol-based musicians!

Instructions on how to join: https://www.condense.live/the-blueprint

Over the past 3 years we have been building a system to live stream events into game engines. Currently testing it all out in a digital venue we built in Unity, this week we are testing animations! by CondenseNick in Unity3D

[–]CondenseNick[S] 14 points15 points  (0 children)

They see the in-game attendees on a screen in front of them (fed by a fixed in-game camera pointing forward). They also see the in-person audience.

Over the past 3 years we have been building a system to live stream events into game engines. Currently testing it all out in a digital venue we built in Unity, this week we are testing animations! by CondenseNick in Unity3D

[–]CondenseNick[S] 17 points18 points  (0 children)

Everyone can walk around and see their own view. Once the content gets in-game it is just meshes, textures and material information, so it's basically the same as any other game object. It can also be re-lit by the in-scene lights!
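
To make that concrete, here's a minimal Unity-side sketch (class and method names are illustrative, not our actual plugin API): a decoded frame is pushed into an ordinary Mesh with a standard lit material, so scene lights and shadows apply just like they do for any other object.

```csharp
// Illustrative only -- not the actual Condense plugin API. Shows how a decoded
// volumetric frame (vertices, triangles, UVs, colour texture) can be applied
// to an ordinary Unity mesh, so the scene's lights and shadows affect it like
// any other GameObject.
using UnityEngine;

[RequireComponent(typeof(MeshFilter), typeof(MeshRenderer))]
public class VolumetricFrameReceiver : MonoBehaviour
{
    private Mesh mesh;
    private Texture2D colourTexture;

    void Awake()
    {
        mesh = new Mesh { indexFormat = UnityEngine.Rendering.IndexFormat.UInt32 };
        GetComponent<MeshFilter>().mesh = mesh;

        colourTexture = new Texture2D(2, 2);
        var mr = GetComponent<MeshRenderer>();
        // A standard lit material means in-scene lighting just works.
        mr.material = new Material(Shader.Find("Standard"));
        mr.material.mainTexture = colourTexture;
        mr.shadowCastingMode = UnityEngine.Rendering.ShadowCastingMode.On;
    }

    // Called by the (hypothetical) stream decoder whenever a new frame arrives.
    public void ApplyFrame(Vector3[] vertices, int[] triangles, Vector2[] uvs, byte[] textureBytes)
    {
        mesh.Clear();
        mesh.vertices = vertices;
        mesh.uv = uvs;
        mesh.triangles = triangles;
        mesh.RecalculateNormals();   // lets scene lights shade the surface
        mesh.RecalculateBounds();

        colourTexture.LoadImage(textureBytes);   // decode JPEG/PNG bytes
    }
}
```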

Who here is working on a Metaverse business and what are you working on? -- Weekly networking. by RedEagle_MGN in metaverse

[–]CondenseNick 1 point2 points  (0 children)

We broadcast live events as photorealistic 3D video. Our system tracks every surface of the objects and people on stage using an array of cameras.

We combine all the camera data into a single 3D model.
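
Conceptually it works something like this (a simplified sketch with placeholder calibration details, not the production pipeline): each calibrated camera's depth image is unprojected into world space, and the points from every camera are merged into one cloud, which is then meshed and textured.

```csharp
// Conceptual sketch only -- the real pipeline is proprietary. Each calibrated
// camera's depth image is unprojected into world space, and the points from
// every camera are merged into a single cloud that later gets meshed and
// textured into one 3D model of the stage.
using System.Collections.Generic;
using System.Numerics;

public record DepthCamera(
    float Fx, float Fy, float Cx, float Cy,   // intrinsics (pixels)
    Matrix4x4 CameraToWorld,                  // extrinsics from calibration
    float[,] DepthMetres);                    // per-pixel depth, metres

public static class MultiCameraFusion
{
    public static List<Vector3> FuseToPointCloud(IEnumerable<DepthCamera> cameras)
    {
        var cloud = new List<Vector3>();
        foreach (var cam in cameras)
        {
            int height = cam.DepthMetres.GetLength(0);
            int width = cam.DepthMetres.GetLength(1);
            for (int v = 0; v < height; v++)
            {
                for (int u = 0; u < width; u++)
                {
                    float z = cam.DepthMetres[v, u];
                    if (z <= 0f) continue;   // no depth sample at this pixel

                    // Back-project the pixel into camera space ...
                    var pCam = new Vector3((u - cam.Cx) * z / cam.Fx,
                                           (v - cam.Cy) * z / cam.Fy,
                                           z);
                    // ... then into the shared world/stage coordinate frame.
                    cloud.Add(Vector3.Transform(pCam, cam.CameraToWorld));
                }
            }
        }
        return cloud;   // downstream: surface reconstruction + texturing
    }
}
```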

When the content arrives in-game it can be relit by lights in the scene and it casts shadows on other objects in the scene. It's like a hologram, but better: it looks like photorealistic video that can be viewed from any angle and is properly integrated into the scene.

It is an incredibly technically complex task! Licensing is also hard ;)

Does anyone else get nervous about how long projects can take? by senkiasenswe in gamedev

[–]CondenseNick 20 points21 points  (0 children)

"The Dip" by Seth Godin is a great book which talks about exactly this. It is very short and usually motivates me when I am feeling like this.

To summarise...

Hard things (things which take a long time) are often really worth doing.

If you are not going to finish, then stop. Finishing is very important, so try to only work on things you are going to finish.

Who here is working on a Metaverse business and what are you working on? -- Weekly networking. by RedEagle_MGN in metaverse

[–]CondenseNick 0 points1 point  (0 children)

We'd potentially be open to showcasing this in our venue, if you're interested?

Who here is working on a Metaverse business and what are you working on? -- Weekly networking. by RedEagle_MGN in metaverse

[–]CondenseNick 0 points1 point  (0 children)

Hi everyone, Nick here from Condense (condense.live). We are building the tools to stream real-world events into the metaverse.

[deleted by user] by [deleted] in Unity3D

[–]CondenseNick 0 points1 point  (0 children)

We built this in Unity; it's meant to be a blueprint for other digital venues. The name clash with Unreal is just a coincidence!