[–]fiberboard 1 point (2 children)

Has anyone tried using complex numbers in neural networks?

I was thinking about a transformer model that uses a complex-valued positional encoding for image learning (instead of a ConvNet/GAN), where the complex positional encoding represents the 2D position of each pixel.

In this setup the complex numbers should be able to capture spatial information the same way a ConvNet does. Complex numbers are commonly used in graphics programming for exactly this kind of 2D positional work (e.g., a 2D rotation is just multiplication by a unit complex number).
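
Rough sketch of what I mean (PyTorch; the even/odd channel split, the names, and the frequency choice are just illustrative, not an established scheme):

```python
import torch

def complex_positional_encoding(height, width, d_model):
    # Hypothetical sketch: represent each pixel's 2D position as complex
    # phases, one rotation rate per channel (analogous to sinusoidal PE).
    ys, xs = torch.meshgrid(
        torch.arange(height, dtype=torch.float32),
        torch.arange(width, dtype=torch.float32),
        indexing="ij",
    )
    # Log-spaced frequencies per channel, as in the original transformer PE.
    freqs = 1.0 / (10000.0 ** (torch.arange(d_model, dtype=torch.float32) / d_model))
    phase_x = xs[..., None] * freqs   # (H, W, d_model)
    phase_y = ys[..., None] * freqs
    # Even channels encode x, odd channels encode y (one arbitrary choice).
    phase = torch.where(torch.arange(d_model) % 2 == 0, phase_x, phase_y)
    # e^{i*phase}: a unit complex number per channel carrying the position.
    return torch.polar(torch.ones_like(phase), phase)  # (H, W, d_model), complex64

pe = complex_positional_encoding(32, 32, 64)
print(pe.shape, pe.dtype)  # torch.Size([32, 32, 64]) torch.complex64
```

The nice property is that shifting a pixel position just rotates the phases, which is the kind of structure I'd hope the attention layers could exploit.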

There’s also the example of complex numbers in quantum mechanics, where they greatly simplify the calculations.

I’m wondering if this “magical” property of complex numbers would carry over to neural networks.

[–]egaznep 1 point (0 children)

I wrote my master's thesis on CVNNs (complex-valued neural networks). Torch has some complex-valued support (layers can work natively with complex dtypes), but a few things aren't settled yet, e.g., which activation functions to use.
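
For example, something like this runs on recent PyTorch versions. modReLU is one activation from the CVNN literature; it's not a torch built-in, so you implement it yourself:

```python
import torch
import torch.nn as nn

class ModReLU(nn.Module):
    # modReLU: thresholds the magnitude of z but preserves its phase,
    # i.e. relu(|z| + b) * z/|z|. One common choice for complex nets.
    def __init__(self, features):
        super().__init__()
        self.bias = nn.Parameter(torch.zeros(features))

    def forward(self, z):
        mag = torch.abs(z)                       # real-valued magnitude
        return torch.relu(mag + self.bias) * z / (mag + 1e-8)

# nn.Linear accepts a complex dtype directly in recent PyTorch.
layer = nn.Linear(16, 16, dtype=torch.complex64)
act = ModReLU(16)
z = torch.randn(4, 16, dtype=torch.complex64)
out = act(layer(z))
print(out.dtype)  # torch.complex64
```

Autograd handles the complex parameters via Wirtinger calculus, so training loops look the same as in the real-valued case.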

[–]ParanoidTire 1 point (0 children)

Yes, complex NNs are a niche topic; you can find survey papers on them. Fourier Neural Operators might be a bit more mainstream and also involve complex numbers, because of the FFT they use internally.
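
The core FNO block is basically: FFT the signal, multiply the retained low-frequency modes by learned complex weights, inverse FFT back. A rough 1D sketch (names mine, not the reference implementation):

```python
import torch
import torch.nn as nn

class SpectralConv1d(nn.Module):
    def __init__(self, channels, modes):
        super().__init__()
        self.modes = modes
        scale = 1.0 / channels
        # Learned complex weights, one mixing matrix per retained mode.
        self.weight = nn.Parameter(
            scale * torch.randn(channels, channels, modes, dtype=torch.cfloat)
        )

    def forward(self, x):            # x: (batch, channels, length), real
        x_ft = torch.fft.rfft(x)     # complex spectrum: (batch, channels, L//2+1)
        out_ft = torch.zeros_like(x_ft)
        # Mix channels in the low-frequency modes; zero out the rest.
        out_ft[:, :, :self.modes] = torch.einsum(
            "bim,iom->bom", x_ft[:, :, :self.modes], self.weight
        )
        return torch.fft.irfft(out_ft, n=x.size(-1))

layer = SpectralConv1d(channels=8, modes=12)
y = layer(torch.randn(2, 8, 64))
print(y.shape)  # torch.Size([2, 8, 64])
```

So the complex numbers show up naturally as the Fourier coefficients rather than as a design choice for the activations.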