Do entanglement networks encode a unique, local historical record?

  • Thread starter asklepian
  • Tags: Decoherence
  • #1
asklepian
TL;DR Summary
How does decoherence take quantum systems to effectively classical states by encoding their interactions into the environment? Can light cones be seen as local entanglement networks reflecting spacetime's causal history? Is superluminal travel barred by decoherence and causality constraints?
In the recent paper "The Decoherent Arrow of Time and the Entanglement Past Hypothesis" by Al-Khalili and Chen, the authors propose that the early universe had very low entanglement entropy, which increases over time and gives rise to the decoherent arrow of time.

1.) Can you elaborate on how decoherence functions as the process by which an initially indeterminate quantum system becomes effectively determined through interactions with its environment? Specifically, how does this process encode information about the system's state into the environment, thus creating a unique local historical record of these interactions? Additionally, what are the quantitative measures, such as von Neumann entropy or entanglement entropy, that can be used to describe this transition from a quantum superposition to a classical mixture?
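
For concreteness, the textbook toy model I have in mind (a single system qubit coupled to an environment, in my own notation) is
$$\big(\alpha|0\rangle+\beta|1\rangle\big)\otimes|E\rangle \;\longrightarrow\; \alpha|0\rangle|E_0\rangle+\beta|1\rangle|E_1\rangle .$$
Tracing out the environment gives the reduced density matrix
$$\rho_S=\begin{pmatrix}|\alpha|^2 & \alpha\beta^*\langle E_1|E_0\rangle\\ \alpha^*\beta\,\langle E_0|E_1\rangle & |\beta|^2\end{pmatrix},$$
so as the environment records which branch it interacted with, ##\langle E_0|E_1\rangle\to 0##, the off-diagonal coherences are suppressed and ##\rho_S## approaches a classical mixture. Its von Neumann entropy ##S(\rho_S)=-\mathrm{Tr}\,\rho_S\ln\rho_S## grows from ##0## toward ##-|\alpha|^2\ln|\alpha|^2-|\beta|^2\ln|\beta|^2##, which is the kind of quantitative measure I mean.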

2.) Considering that light cones define regions of spacetime within which causal interactions can occur, can we interpret these regions as encoding a local network of quantum entanglements that reflect the unique historical interactions within each light cone? How does this interpretation align with our current understanding of quantum entanglement and the causal structure of spacetime in the framework of general relativity? Are there any existing models or quantitative frameworks that describe this relationship?
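
The closest quantitative statement I am aware of (possibly not what is really needed here) is the area-law scaling of vacuum entanglement entropy in quantum field theory: for a spatial region ##A## with boundary ##\partial A##, the entropy of the reduced vacuum state goes like
$$S_A \sim c\,\frac{\mathrm{Area}(\partial A)}{\epsilon^{2}}+\dots$$
in four spacetime dimensions, with UV cutoff ##\epsilon## and a non-universal constant ##c##. I do not know whether or how this ties in to the light-cone picture above.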

3.) Is the prohibition of superluminal travel related to increased decoherence in regions of high energy density, potentially disrupting local entanglement correlations and leading to indeterminate states and causality violations? If superluminal travel were hypothetically possible, would superluminal disentanglement and subsequent subluminal reentanglement allow the encoding of a 'new' history, leading to causality violations? How do modern theoretical frameworks, such as quantum field theory and general relativity, quantitatively address these issues?
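
On the last point, the quantitative statement I know of from quantum field theory is microcausality: local observables at spacelike separation commute,
$$\big[\hat{\mathcal{O}}_1(x),\hat{\mathcal{O}}_2(y)\big]=0 \quad\text{for spacelike separated } x,y,$$
which is what forbids superluminal signalling in standard QFT. My question is whether decoherence adds anything to, or follows from, this kind of constraint.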
 
  • #2
asklepian said:
In the recent paper "The Decoherent Arrow of Time and the Entanglement Past Hypothesis" by Al-Khalili and Chen, the authors propose that the early universe had very low entanglement entropy, which increases over time and gives rise to the decoherent arrow of time.

1.) Can you elaborate on how decoherence functions as the process by which an initially indeterminate quantum system becomes effectively determined through interactions with its environment? Specifically, how does this process encode information about the system's state into the environment, thus creating a unique local historical record of these interactions? Additionally, what are the quantitative measures, such as von Neumann entropy or entanglement entropy, that can be used to describe this transition from a quantum superposition to a classical mixture?
The authors reference Hartle and Gell-Mann, whose work is a good place to start: https://arxiv.org/pdf/gr-qc/9509054. In that paper they discuss how "classicality" might be quantified.

Indeterminacy is only eliminated in the sense that each possible history from an appropriate set will exhibit time correlations well-approximated by classical physics. The resolution of a unique history by our measurements is still only given probabilistically.
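
To give a flavour of the quantification (notation roughly following Gell-Mann and Hartle): a coarse-grained history is represented by a time-ordered chain of projectors ##C_\alpha=P^n_{\alpha_n}(t_n)\cdots P^1_{\alpha_1}(t_1)##, and the central object is the decoherence functional
$$D(\alpha',\alpha)=\mathrm{Tr}\big[C_{\alpha'}\,\rho\,C_\alpha^\dagger\big].$$
A set of histories decoheres when the off-diagonal terms are negligible, ##D(\alpha',\alpha)\approx 0## for ##\alpha'\neq\alpha##; the diagonal elements ##p(\alpha)=D(\alpha,\alpha)## can then consistently be assigned as probabilities, and how well a set satisfies this condition is one way of grading its "classicality".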

[edit] - The above paper by Gell-Mann and Hartle references literature of theirs exploring classicality (or classicity) that might be better to start with: https://arxiv.org/abs/1803.04605 and https://arxiv.org/abs/gr-qc/9210010 and https://arxiv.org/abs/1905.05859
 
  • #3
For a bit of a different perspective, I would also recommend "The arrow of time in operational formulations of quantum theory" by Di Biagio, Donà, and Rovelli:
https://arxiv.org/abs/2010.05734

The operational formulations of quantum theory are drastically time oriented. However, to the best of our knowledge, microscopic physics is time-symmetric. We address this tension by showing that the asymmetry of the operational formulations does not reflect a fundamental time-orientation of physics. Instead, it stems from built-in assumptions about the users of the theory. In particular, these formalisms are designed for predicting the future based on information about the past, and the main mathematical objects contain implicit assumptions about the past, but not about the future. The main asymmetry in quantum theory is the difference between knowns and unknowns.
 