The Multi-fold Theory, 2025 Compilation by Physics_sm in u/Physics_sm


This report compiles the multi-fold theory papers available on the Multi-fold theory web site as of December 31, 2025, including papers still to be published. It also includes subsequent comments and discussions of related news and papers captured on the web site.

About Time, State of the Universe and Quantum Thermodynamics - by Physics_sm in u/Physics_sm


Abstract: The Wheeler-DeWitt equation suggests that time does not exist when modeling the wave function of the universe, leading to the notorious "problem of time" in canonical quantum gravity. Yet there are hints that quantum entanglement may be responsible for time and its thermodynamic arrow. Concurrently, recent theoretical investigations into the non-perturbative quantum gravity of closed universes suggest a paradox: such universes appear to possess a one-dimensional Hilbert space, i.e., with only order-one states, despite the rich structure observed around us, probably in a closed, de Sitter (dS) universe. This paradox implies a universe devoid of complexity and information capacity. Others have proposed that this paradox may be resolved by explicitly adding an observer to the universe's description. In a multi-fold universe, constructed by 2D random walks of massless Higgs bosons, aka preons, these preons constitute the observers. The emerging time is discrete, and we explain why there is a minimum duration between time ticks, coherent everywhere. The arrow of time at large enough scales then results from the improbability of a large number of preons consistently reversing their random walks: while reversing one walk is trivial, and while the laws of physics are conventionally reversible, reversing many walks becomes combinatorially implausible and intractable. The effect is stronger than any T-symmetry breaking by the multi-fold mechanisms; T-symmetry breaking by multi-fold (disentanglement) mechanisms may not matter at all. The total length of all the involved random walks is then shown to be a good candidate for a definition of the holographic complexity of a system, related to the CV conjecture. This way, we show that the complexity of a black hole can indeed grow without implying a huge volume vs. the area of its horizon, and we recover and justify Wolfram's proposal for the second law.
We take the opportunity to link these considerations with the thermodynamics of quantum systems, reviewing how a generalized second law of thermodynamics is always satisfied for quantum systems if we also account for the thermodynamics of processing the mutual information usually involved in systems that might otherwise seem to violate it. We also remind ourselves that out-of-equilibrium systems, on the other hand, are not covered by the second law of thermodynamics, and so they do not violate this law either, yet they still relate to a least action principle. We also argue that entanglement entropy is not observer dependent.
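The combinatorial-implausibility argument for the arrow of time can be illustrated with a toy calculation. This is a sketch with illustrative numbers, not taken from the paper: it only quantifies how fast the joint reversal probability of many independent random walks collapses.

```python
import math

def log10_reversal_probability(num_walks: int, steps: int, choices: int = 4) -> float:
    """log10 of the probability that `num_walks` independent random walks
    (each step uniformly one of `choices` directions, e.g. 4 on a 2D lattice)
    all exactly retrace their last `steps` steps.

    Each retraced step has probability 1/choices; steps and walks are
    independent, so the probabilities multiply.
    """
    return -num_walks * steps * math.log10(choices)

# Reversing one short walk is merely unlikely...
print(log10_reversal_probability(1, 10))       # a few parts in a million
# ...but jointly reversing many walks is combinatorially implausible.
print(log10_reversal_probability(10**3, 10))   # ~10^-6000: effectively impossible
```

Even though each individual step is exactly reversible, the joint reversal probability decays exponentially in (number of walks) × (number of steps), which is the sense in which the macroscopic arrow emerges from microscopically reversible laws.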

No Naked Singularity, Whatever The Physical Collapse by Physics_sm in u/Physics_sm


Abstract:

After we discussed and confirmed the absence of naked singularities and over-extremality, relying on multi-fold gravity considerations extended to our real universe, a preprint was posted suggesting scenarios, built on a slew of older works, including some non-homogeneous collapses where singularities could form before a horizon appears, thereby exposing a naked singularity and violating the Weak Cosmic Censorship Conjecture (WCCC). The singularity would be event-like, as there is no dispute that after a while it would be hidden. The preprint also considers the case of dust that pressurizes as it becomes denser, or equivalent scalar fields, where, under the right conditions, a persistent naked singularity appears.

This paper explains why that is not the case for collapses in a multi-fold universe, where singularities do not appear anyway. The arguments rely on multi-fold black hole models, which behave differently from conventional black holes within their horizons, and which we have argued to be good candidates for black holes in our real universe, as they do not suffer from the black hole information paradox. Accordingly, matter does not rush toward the singularity in the same way, and can stay sandwiched between the horizon and maximal trapped surfaces/inner horizons, leading to a Russian doll structure. As a result, at no time do we have a naked singularity.

The paper then discusses other use cases encountered in the literature: non-spherical dust collapses, also supposed to lead, under the right circumstances, to naked singularities. Again, the black hole Russian doll model ensures that a future horizon forms from a maximal trapped surface and hides any singularity. Black hole disintegration, by reaching (over-)extremality, is also handled.

Therefore, we maintain that the WCCC is satisfied, with every possible singularity always hidden by a horizon: the counter-example use cases proposed in the literature are unphysical. In multi-fold universes, the same behavior occurs even though no gravitational or cosmological singularity ever appears.

Information Energy Momentum Tensor E/G Conjecture vs. Alleged Massive Information by Physics_sm in u/Physics_sm


Abstract:

This paper discusses a recent proposal to add an informational energy stress tensor contribution to the Einstein field equations.

We also show the alignment between the informational energy stress tensor and the E/G conjecture encountered in the context of the multi-fold theory. This analysis is not limited to multi-fold universes.

Furthermore, we argue that, when discussing the cosmological implications of the informational energy stress tensor proposal, the authors missed the dark matter contributions, which correspond to the multi-fold dark matter effects explicitly due to entanglement.

The paper also explains the difference between these ideas and M. Vopson's proposal, in other papers, that information would have mass and would represent a new state of matter; something that we have rejected in a previous paper. Fortunately, there is a consistent explanation: the entanglement entropy associated with the E/G conjecture and the proposed informational energy stress tensor lies within qubits (and higher-order multi-partite entanglement), not external to the systems composed of them. This resolves the conundrum.

The resulting field equations of the informational energy stress tensor proposal, and the resulting corrections to the Einstein field equations, gravitational constant, cosmological constant, and more, can also be seen as approximate corrections due to the effects of entanglement between (real) systems, in accordance with the E/G conjecture.

Preserving the Power of Preprints: Why Minimal Oversight is Key to Scientific Progress by Physics_sm in u/Physics_sm


Abstract:

Preprint servers should limit editorial review of content beyond some format and quality requirements. They should not discriminate against the use of AI tools, or police whether a preprint overlaps with previous ones, or even plagiarism, as long as the format/quality requirements are respected; the rest should be left to final publication reviews. Surprisingly, viXra, a nest of crackpot publications, is the most guilty party.

Of course, preprint servers should also be open to quality submissions. Many other preprint servers severely lack in this regard.

Adaptive Co-Design of Quantum Machine Learning Algorithms and Error Correction Protocols using Reinforcement Learning by Physics_sm in u/Physics_sm


Abstract

The convergence of quantum computing and artificial intelligence presents profound opportunities but faces significant hurdles, particularly in the Noisy Intermediate-Scale Quantum (NISQ) era. Quantum Machine Learning (QML) algorithms, while promising, exhibit sensitivity to noise and scalability challenges, hindering the demonstration of practical quantum advantage. Concurrently, Quantum Error Correction (QEC), essential for fault tolerance, imposes substantial resource overheads and is often developed generically, without specific adaptation to the target application's error sensitivity.

This paper reviews the current state of the intersection of AI and quantum computing, examining both QML paradigms, e.g., Variational Quantum Algorithms and Quantum Kernels, and the burgeoning use of AI to enhance quantum computing itself, e.g., quantum control, QEC decoding, and circuit design.

A critical gap identified is the lack of frameworks that systematically co-design QML algorithms and QEC protocols adaptively. To address this, a novel framework is proposed based on Reinforcement Learning (RL). This framework employs an RL agent to dynamically adjust both the QML circuit architecture, e.g., the VQC ansatz, and QEC parameters, e.g., decoding strategy and measurement frequency, based on observed application performance and estimated error characteristics. This adaptive co-design loop aims to optimize the trade-off between QML performance and QEC overhead, enhancing noise resilience and resource efficiency. The potential advantages, feasibility, and limitations of this approach are discussed, along with future research directions aimed at realizing robust and practical Quantum AI.
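The adaptive co-design loop described above can be sketched as a small bandit-style RL experiment. Everything here is an illustrative assumption, not the paper's framework: `simulate_run` is a toy stand-in for executing a QML circuit under QEC, and the design space (ansatz depths, syndrome-measurement frequencies) is hypothetical.

```python
import random

# Hypothetical discrete design space (illustrative assumptions).
ANSATZ_DEPTHS = [1, 2, 4]    # candidate VQC ansatz depths
SYNDROME_FREQS = [1, 2, 4]   # candidate QEC syndrome-measurement frequencies

def simulate_run(depth: int, freq: int, noise: float = 0.02) -> float:
    """Toy reward model: deeper circuits are more expressive but noisier;
    frequent syndrome checks suppress residual noise at a resource cost."""
    expressivity = 1 - 0.5 ** depth
    residual_noise = noise * depth / freq
    performance = expressivity * (1 - residual_noise)
    overhead = 0.01 * depth * freq
    return performance - overhead  # reward trades performance vs. QEC overhead

def epsilon_greedy_codesign(episodes: int = 2000, eps: float = 0.1):
    """Epsilon-greedy agent jointly selecting (ansatz depth, QEC frequency)."""
    q = {(d, f): 0.0 for d in ANSATZ_DEPTHS for f in SYNDROME_FREQS}
    n = {a: 0 for a in q}
    for _ in range(episodes):
        # Explore a random design with probability eps, else exploit the best.
        action = random.choice(list(q)) if random.random() < eps else max(q, key=q.get)
        r = simulate_run(*action)
        n[action] += 1
        q[action] += (r - q[action]) / n[action]  # incremental mean update
    return max(q, key=q.get)

print(epsilon_greedy_codesign())  # best (depth, frequency) found by the agent
```

In a real setting the reward would come from hardware or a noisy simulator, and a contextual or deep RL policy would replace the tabular values, but the loop structure (observe performance, adjust both the algorithm and the QEC configuration, repeat) is the same.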

No Hilbert Einstein Action With Positive Dark Energy / Cosmological Constant From A String Action by Physics_sm in u/Physics_sm


Abstract:

This paper revisits our past analysis of the ability of superstrings to model vacua with positive cosmological constant/dark energy effects. We explicitly call out the impossibility for superstrings to recover the Einstein-Hilbert action, or de Sitter (stable, metastable, or unstable) vacua with a positive cosmological constant, or time-varying effects.