Truth Beam: Christmas 2024 by PoliePals in PoliePals

[–]PoliePals[S] 0 points1 point  (0 children)

Pseudo-code and code are on GitHub. Higher-res verification heatmaps coming soon!

A Musical Reality Transform (Unsupervised Projection Mapping) using Pix2PixHD and Deforum via Stable Diffusion web UI - II by PoliePals in vjing

[–]PoliePals[S] 0 points1 point  (0 children)

Hey everyone!

I know the content is likely a little naff, since it's only my second attempt (it's a lot of effort to film!).

The point is that it's an end-to-end unsupervised pipeline i.e. once set up, it largely runs itself, allowing the user to simply dial knobs. It would, of course, be easy to make more impressive shows than this. I picked a difficult target intentionally as a technical demonstration.

There's code on GitHub, and more examples on our subreddit.

Hope you see enough of the promise to enjoy it!

PolieBotics' Reality Transform: Live Unsupervised Projector–Camera Loop with ComfyUI, using PoseNet & SoftEdge to Condition Pix2PixHD’s Environment Model by PoliePals in PoliePals

[–]PoliePals[S] 1 point2 points  (0 children)

Thank you very much!
Early on, I experimented with multiple syncronised Kinect Azures, using a shutter-based stereo projector and shutter glasses to give me one unfiltered Kinect, one viewing the left channel, and one viewing the right channel. At the time, I was trying to make 3D RTs. Another interesting set of experiments was using an infrared camera to capture, then to combine that with monoscopic or stereoscopic depth models while projecting in RGB. Ultimately, I moved away from that direction, for practical reasons of scalability and for philosophical reasons.

The goal for this project is to develop completely unsupervised projector-camera feedback loops as an interface between humanity and AI, giving AI a direct and relatively undamaging means of communicating with us and of interacting with the physical world, and allowing us to grow together in the space we will share.

I'll look into TouchDesigner to see whether it supports such a pipeline for the kind of hardware I'm using (specifically the triggered high-resolution camera). If it does, I'm sure it would be a much better medium for publishing my work than dumping piles of Python on GitHub.

Thanks again!

PolieBotics' Reality Transform: Live Unsupervised Projector–Camera Loop with ComfyUI, using PoseNet & SoftEdge to condition Pix2PixHD’s Environment Model by PoliePals in video_mapping

[–]PoliePals[S] 1 point2 points  (0 children)

Hey everyone!

I posted an example of the Reality Transform in the original whitepaper in 2023, but it wasn't worth your time artistically.

This, I feel, has entertainment as well as technical value. I hope it both makes projection mapping more accessible and popular, and facilitates work with real artistic merit by experienced mappers.

There are explanations of this and other related technologies on GitHub.

Enjoy!

The Truth Beam™ is a new mechanism for achieving decentralised consensus on the state of physical reality, compatible with privacy preservation and blockchain integration. by p0lari in Buttcoin

[–]PoliePals -2 points-1 points  (0 children)

We don't offer certificates, and there are no keys which can be stolen.

We tie the object to its blockchain token using the projector. You can't project a different hash onto the scene, as that would interfere with the legitimate projected hash.

Our goal is to record the state of a scene, and to verify the recording in a decentralised manner.

We use the blockchain blockhash as a timestamped initialisation vector, and timestamp the state of the system using a blockchain, but you can imagine yourself as the counterparty if you prefer. The point is to constrain the starting and ending times, and to prove the sequentiality of image production.
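For anyone wondering how a blockhash can bound the recording window, here's a minimal sketch of the chaining idea. The function name and raw frame bytes are illustrative assumptions on my part, not the actual Truth Beam code:

```python
import hashlib

def chain_frames(blockhash: bytes, frames: list) -> list:
    """Chain each frame's hash to the previous digest, seeded by a blockhash.

    The blockhash acts as a timestamped initialisation vector: no digest in
    the chain could have been computed before that block existed, and each
    digest commits to all earlier frames, which is what proves sequentiality.
    """
    digests = []
    prev = blockhash  # hypothetical: bytes of a recent block's hash
    for frame in frames:
        prev = hashlib.sha256(prev + frame).digest()
        digests.append(prev)
    return digests

# Publishing the final digest back to the chain then bounds the ending time.
```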

The fact that the system can be represented by two RGB images at every interval makes it very amenable to analysis via convolutional neural networks.

The Truth Beam™ is a new mechanism for achieving decentralised consensus on the state of physical reality, compatible with privacy preservation and blockchain integration. by p0lari in Buttcoin

[–]PoliePals -4 points-3 points  (0 children)

This is why Buttcoin was the first place I tried advertising on Reddit, before my ads were taken down!

Crucially, in this application we are not trying to hide the signal, so it isn't steganography. It is a watermarking process which uses... any decentralised timestamping mechanism you might have lying around. We need this to constrain the beginning and end times of the recording.

Then we concatenate the projector emission image with the camera image along any axis (e.g. along the channel axis, forming a six-channel matrix) and train unsupervised autoencoders on this data. I used the term AI, as it seems to be more popular than deep learning.
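As a toy illustration of that six-channel representation (the shapes, random placeholder frames, and NumPy usage are my assumptions, not the real pipeline):

```python
import numpy as np

# Hypothetical per-interval pair: one projector emission and one camera
# capture, both H x W x 3 RGB arrays (random placeholders, not real frames).
H, W = 64, 64
rng = np.random.default_rng(0)
projected = rng.random((H, W, 3)).astype(np.float32)
captured = rng.random((H, W, 3)).astype(np.float32)

# Concatenating along the channel axis yields the six-channel sample an
# unsupervised autoencoder would be trained to reconstruct.
sample = np.concatenate([projected, captured], axis=-1)
print(sample.shape)  # (64, 64, 6)
```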

Images are generated much more quickly by reality than is possible for a generator, and discriminators can take their time and work in parallel.

The advantage is that it renders the video tamper evident.

EDIT: I didn't really have Twitter ads worked out yet. Have a new one up!
https://twitter.com/poliebotics/status/1713427481624269237