Information as the key underlying physical principle

In summary, in the field of quantum mechanics, the concept of "information" has become the most fundamental unit. This is evident in phenomena such as quantum entanglement and black hole evaporation, where information plays a crucial role. There are numerous papers and books discussing this topic, including the work of Asher Peres and the paper "Relational EPR" by Rovelli and Smerlak. Quantum information is also being explored as a way to derive quantum mechanics itself, as shown in the paper "Informational derivation of Quantum Theory" by Chiribella, D'Ariano, and Perinotti. However, it is important to remember that information is physical and cannot exist without a physical system to contain it.
  • #1
rumborak
It seems to me, at least when it comes to quantum mechanics, "information" has become the most basic unit. Like, quantum entanglement works to the point that information is extracted, and one can even revert certain things by making sure the information is destroyed. Same with the discussion about black hole evaporation etc, where the hinging point was that a black hole would destroy any information falling into it.

My question would be, is there any material (books, papers) discussing this? I find this whole subject very tantalizing, maybe because I work in machine learning where information is the basic unit too.

EDIT: Just to make sure what I'm asking is clear: I'm not too interested in the specific black hole or quantum eraser experiments, but rather in a slightly more "philosophical" discussion of what this shift might reveal about physical "reality".
 
  • #2
I think the late Asher Peres was a central figure in the development of Quantum Information Theory.

You might check in arXiv.org for papers by Peres_A

I'm very far from expert in this, don't know the literature. Some other people will probably answer over the next day or so, with better leads.

I know of Asher Peres in part through mention at the end of a paper called "Relational EPR" by Rovelli and Smerlak. You'll probably get the online PDF if you just google "relational EPR". It is a well-known paper. They quote Peres at the end. Basically he nailed it. QM is about information. Things evolve continuously perhaps, but they interact in discrete quanta, and all we can really say about nature is based on those interactions. There is no grand cosmic observer. Reality is relational. The story is always what is seen from some observer's standpoint, IOW information. And observers can reconcile their accounts when they communicate (but that is an ongoing process limited by the speed at which information can be transmitted).

Bohr is quoted as saying something like "It's not what physical reality IS that concerns us, it's what we can say about it." In a subtle way, quantum reality is made of information (interactions, qubits, quantum events, maybe one could simply say "facts"). I got a lot out of the paper called "Relational EPR", but I can't pretend to broad knowledge of the quantum foundations or quantum information theory literature.
 
  • #3
Funny, I too work in IT and I have the exact same feeling... Please post further references if you find any...
 
  • #4
I've floated this question on PF in the past, and couldn't get much traction. I think the idea teeters too much on the edge of being purely philosophical.
But yes, I tend to agree with the points made so far. Physical existence as described by modern physics would seem (to me anyway) to be more of an informational manifestation rather than one that we would intuitively think of as "material". I've even gone so far as to ask whether Noether's theorem might be used to demonstrate some type of universal symmetry that results in the conservation of information. Couldn't get much input on that. Perhaps the problem is the philosophical fuzziness of trying to posit a physical existence that is really only information at the fundamental level. It obviously can lead to a discussion about what, if any, role consciousness has in this description, and that will get the thread terminated pretty quickly.
 
  • #5
Here's the problem, as I see it. It sounds like you all work in IT, or some type of computer science anyway. So you are very familiar with the software/hardware interdependence. But how do you run computations without ANY hardware? Can you guys help me with that idea?
 
  • #6
If the discussion isn't to veer into Philosophy, someone should state the precise definition of "information" in QM - if it has one.
(I hope answering that question will be simpler than answering the question "What are the domain and codomain of a wave function?")
 
  • #7
Feeble Wonk said:
Here's the problem, as I see it. It sounds like you all work in IT, or some type of computer science anyway. So you are very familiar with the software/hardware interdependence. But how do you run computations without ANY hardware? Can you guys help me with that idea?

I don't think your analogy works. Physics, however you want to describe it, doesn't need "hardware" to run on. I mean, even in its current normal description, an electron is solely described by a few parameters, none of which are embedded into "hardware".
Similarly, the information that we're talking about, which *is* essentially the collection of parameters, wouldn't need hardware to run on.

I don't know, I thought the initial reply was very interesting, in how physics might eventually be more about the relations between things than about the things themselves.
 
  • #8
That's essentially consistent with Leibniz's proposition of "relational space".
 
  • #9
rumborak said:
It seems to me, at least when it comes to quantum mechanics, "information" has become the most basic unit. Like, quantum entanglement works to the point that information is extracted, and one can even revert certain things by making sure the information is destroyed. Same with the discussion about black hole evaporation etc, where the hinging point was that a black hole would destroy any information falling into it.

My question would be, is there any material (books, papers) discussing this? I find this whole subject very tantalizing, maybe because I work in machine learning where information is the basic unit too.

Stephen Tashi said:
If the discussion isn't to veer into Philosophy, someone should state the precise definition of "information" in QM - if it has one.

http://arxiv.org/abs/1011.6451
Informational derivation of Quantum Theory
G. Chiribella, G. M. D'Ariano, P. Perinotti

But perhaps one should also remember that "information is physical" :)

http://arxiv.org/abs/quant-ph/0610030
Reference frames, superselection rules, and quantum information
Stephen D. Bartlett, Terry Rudolph, Robert W. Spekkens
"Recently, there has been much interest in a new kind of "unspeakable'' quantum information that stands to regular quantum information in the same way that a direction in space or a moment in time stands to a classical bit string: the former can only be encoded using particular degrees of freedom while the latter are indifferent to the physical nature of the information carriers."
 
  • #10
atyy said:
http://arxiv.org/abs/1011.6451
Informational derivation of Quantum Theory
G. Chiribella, G. M. D'Ariano, P. Perinotti


Does this imply that the definition of "information" in QM is particular to that paper? Is the definition impossible to state in a concise manner?
 
  • #11
Stephen Tashi said:
Does this imply that the definition of "information" in QM is particular to that paper? Is the definition impossible to state in a concise manner?

Yes, that's particular to the paper.

It should be impossible to state in a concise way. Classical information theory is a branch of classical probability theory. Is quantum mechanics a generalization or a special case of classical probability?

It depends on one's interpretation. The Chiribella et al paper stands in a long line of reformulation of quantum mechanics as a generalization of probability. In classical probability, pure states are extreme points of a simplex, which is not true in quantum mechanics.
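As a concrete illustration of the simplex point (a small numerical sketch of my own, not taken from the paper): the maximally mixed qubit state can be written as an equal mixture of the Z eigenstates or, just as well, of the X eigenstates, whereas a classical distribution over N outcomes is a point in the simplex and decomposes uniquely into the deterministic extreme points.

```python
import numpy as np

def pure(vec):
    """Density matrix |v><v| of a normalized state vector."""
    v = np.asarray(vec, dtype=complex)
    v = v / np.linalg.norm(v)
    return np.outer(v, v.conj())

z0, z1 = pure([1, 0]), pure([0, 1])      # Z eigenstates |0>, |1>
xp, xm = pure([1, 1]), pure([1, -1])     # X eigenstates |+>, |->

mix_z = 0.5 * z0 + 0.5 * z1              # 50/50 mixture of Z eigenstates
mix_x = 0.5 * xp + 0.5 * xm              # 50/50 mixture of X eigenstates

print(np.allclose(mix_z, mix_x))         # True: one mixed state, two distinct pure-state decompositions
```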

On the other hand, Bohmian Mechanics shows that at least some forms of quantum mechanics (eg. non-relativistic quantum mechanics) can also be interpreted as special cases of classical probability theory.

To use an analogy from classical physics: do we live in the best of all possible worlds? :D Yes, from the point of view that it can be formulated from an action principle. However, the action is not always unique. For example, classical general relativity is the best of all possible worlds by the Hilbert action, the Palatini action, the Holst action, etc.
 
  • #12
atyy said:
Yes, that's particular to the paper.

It should be impossible to state in a concise way. Classical information theory is a branch of classical probability theory. Is quantum mechanics a generalization or a special case of classical probability?

Is it important to discuss the paper's definition of "information"?

I, personally, am not averse to discussing questions in a philosophical way, but isn't the policy of the forum that discussion should deal with specific technical issues? (Perhaps "Beyond The Standard Model" has some leeway?)
 
  • #13
Stephen Tashi said:
Is it important to discuss the paper's definition of "information"?

I, personally, am not averse to discussing questions in a philosophical way, but isn't the policy of the forum that discussion should deal with specific technical issues? (Perhaps "Beyond The Standard Model" has some leeway?)

I was suggesting that paper's definition as a technical definition (to start). So we can have philosophy and technical discussion by comparing the different axiomatizations of quantum mechanics. Let's avoid Bohmian Mechanics here. Bohmian Mechanics aims to remove the notion of observers from quantum mechanics. However, the Chiribella et al paper stands in the Copenhagen tradition of quantum mechanics as an operational theory: yes observers are physical, and presumably there should be some way to describe observers and quantum systems consistently, but since in practice we have no trouble distinguishing our classical selves from quantum systems, let's describe quantum mechanics in terms of what operations we can do. Here are several alternative axiomatizations (there's old papers too, but these new ones are more readily available).

http://arxiv.org/abs/quant-ph/0101012
Quantum Theory From Five Reasonable Axioms
Lucien Hardy


http://arxiv.org/abs/1011.6451
Informational derivation of Quantum Theory
G. Chiribella, G. M. D'Ariano, P. Perinotti

http://arxiv.org/abs/quant-ph/0508042
A limit on nonlocality in any world in which communication complexity is not trivial
Gilles Brassard, Harry Buhrman, Noah Linden, Andre A. Methot, Alain Tapp, Falk Unger
 
  • #14
atyy said:
I was suggesting that paper's definition as a technical definition (to start). So we can have philosophy and technical discussion by comparing the different axiomatizations of quantum mechanics.

That would be a bit of a thread hijack, I think, unless those approaches define different types of "information". Do they?
 
  • #15
Stephen Tashi said:
That would be a bit of a thread hijack, I think, unless those approaches define different types of "information". Do they?

Yes and no. I will say no, because the Chiribella et al paper is at least partly inspired by Hardy's. The Brassard et al paper is about communication complexity, so I think that's easily informational. For yes - I think we could discuss the difference between the Chiribella et al axioms and Hardy's - are the former really more "informational"?

And really, you can take the discussion wherever you want. I'm just suggesting some solid ground to stand on to start if anyone is interested.
 
  • #16
Early in the Hardy paper, I have a mathematician's frustration with the language of physicists:
The state associated with a particular preparation is defined to be (that thing represented by) any mathematical object that can be used to determine the probability associated with the outcomes of any measurement that may be performed on a system prepared by the given preparation

The ambiguous phrase "any mathematical object that can be used to determine...of any measurement" might imply that if there exists some measurement whose probabilities a mathematical object can determine, then the mathematical object is a state. Or it might imply that if a mathematical object has the property that, for any measurement performed on a system, it can be used to determine the probabilities of the outcomes, then the mathematical object is a state. From the subsequent text, the latter alternative is what is meant.

Then there is objectionable language:
The number of degrees of freedom, K, is defined as the minimum number of probability measurements needed to determine the state, or, more roughly, as the number of real parameters required to specify the state

I know of no technique to make a direct "probability measurement". One measures observed frequencies of events. Perhaps physicists have faith that the observed frequency of an event is equal to its probability. I think no harm is done if the language is modified to say that K is defined as the minimum number of probabilities that must be "given" or "specified" in order to determine the state.

There is a mathematical question of whether a finite minimum exists. If we have infinite sets of probabilities, it may take an infinite number of "given" probabilities to determine the state. So are we assuming K must be a finite number? [Edit: He seems to say yes afterwards - if "N" represents a finite number.]
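To make the counting concrete, here is a sketch of my own (using the standard density-matrix formalism rather than Hardy's abstract framework, so take it only as an illustration): for a qubit, specifying four probabilities determines the state, which matches Hardy's quantum value K = N^2 = 4 for N = 2.

```python
import numpy as np

# Four "given" probabilities for a qubit: P(+1) for Z, X, Y measurements, plus the trace.
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1.0, -1.0]).astype(complex)

def probs_from_state(rho):
    """The K = 4 specified numbers that characterize the state."""
    p = lambda op: np.real(np.trace(rho @ (I + op) / 2))   # Born-rule probability of outcome +1
    return p(Z), p(X), p(Y), np.real(np.trace(rho))

def state_from_probs(pz, px, py, norm):
    """Reconstruct the density matrix from the four specified probabilities."""
    rz, rx, ry = 2 * pz - norm, 2 * px - norm, 2 * py - norm   # Pauli expectation values
    return (norm * I + rx * X + ry * Y + rz * Z) / 2

rho = np.array([[0.7, 0.2 + 0.1j], [0.2 - 0.1j, 0.3]])
print(np.allclose(state_from_probs(*probs_from_state(rho)), rho))   # True: 4 probabilities suffice
```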
 
  • #17
marcus mentioned Rovelli's RQM in response to the OP, while my instinct was to mention Chiribella et al. I'd like to mention that although they differ in technical details, they are similar in spirit, in that both are about being able to consistently shift the "Heisenberg cut" between observer and quantum system. Rovelli's RQM is a radical view of reality that is meant to also solve the measurement problem, but while everyone who knows quantum mechanics will feel its spirit is right, it is less clear whether such a radical view really works (Rovelli needs a technical kludge in his paper). The Chiribella paper, on the other hand, does not solve the measurement problem, but, while also dealing with the consistency of shifting the "Heisenberg cut", it does reproduce orthodox QM (in finite dimensional Hilbert spaces).
 
  • #18
Stephen Tashi said:
Early in the Hardy paper, I have a mathematician's frustration with the language of physicists:

Your complaint is unfounded, since you seem to have understood the language after all :p

Stephen Tashi said:
So are we assuming K must be a finite number? [Edit: He seems to say yes afterwards - if "N" represents a finite number.]

One limitation of these axiomatizations of QM is that (I think) they only reproduce QM in finite dimensional Hilbert spaces. From the physics point of view, this is mostly considered ok, since we should be able to make the Hilbert space large enough for all practical purposes.

I should mention that Ken G here on PF has pointed out some problems with the Hardy paper (nothing that cannot be fixed). Hardy also has a later slightly different axiomatization, but I listed the 2001 paper because it is considered a classic.
 
  • #19
From Hardy:

The dimension, N , is defined as the maximum number of states that can be reliably distinguished from one another in a single shot measurement.

I don't know what a "single shot measurement" is. Is it a measurement consisting of a single number which might be one of many possible values that can be realized from a probability distribution? Or does a single shot measurement give a value that represents a probability?

Does this refer to the dimension N of a system, or is it referring to a dimension N of a state? If, for each state of a system, a "single shot" measurement (taken on whatever state the system is in) allows us to distinguish between N states, then N would be a number associated with the "system". If the number N depends on the particular state the system is in when the measurement is taken, then N is a number associated both with a particular state and the particular system. Different states might have different "dimensions N".
Axiom 1 Probabilities. Relative frequencies (measured by taking the proportion of times a particular outcome is observed) tend to the same value (which we call the probability) for any case where a given measurement is performed on a ensemble of n systems prepared by some given preparation in the limit as n becomes infinite.

That's OK for a physicist to say, but it is not what the mathematical Law Of Large Numbers says unless we elaborate the meaning of "in the limit". The type of limit mentioned in the Law Of Large Numbers is a limit of a probability of an observed frequency, not the limit of an observed frequency (with no mention of the probability of the observation). Later Hardy says that Axioms I-V give classical probability theory. Mathematically speaking, Axiom I doesn't, but Axiom I is a practical way of thinking about the world.
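A quick simulation of the distinction (my own illustration, not Hardy's): the weak law says that the probability of the observed frequency deviating from p by more than some epsilon tends to zero; it does not say that any particular observed frequency equals p.

```python
import numpy as np

rng = np.random.default_rng(0)
p, eps = 0.3, 0.01
for n in [100, 10_000, 1_000_000]:
    freqs = rng.binomial(n, p, size=200) / n     # 200 independent experiments of n trials each
    frac_bad = np.mean(np.abs(freqs - p) > eps)  # fraction of runs whose frequency is outside the eps-band
    print(n, frac_bad)                           # this fraction shrinks toward 0; individual runs still fluctuate
```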

Axiom 2 Simplicity. K is determined by a function of N (i.e. K = K(N)) where N = 1, 2, ... and where, for each given N, K takes the minimum value consistent with the axioms.

As I understand the paper, each state has a "degrees of freedom K" and there is no comment on whether K may vary from state to state. To say K is a function of N might mean one of the following:

1. For a given system , the K for a state in the system is a function K(N) of the N for the state. But for a different system K might be a different function of N.

2. There is a single function K(N) that gives the value of K for a state in a system as a function of the N for the state, and this function works for all systems.

3. Each state in a given system has the same value K and the system has a single value N. There is a function K(N) that gives K as a function of N as N varies from system to system.
 
  • #20
Stephen Tashi said:
but it is not what the mathematical Law Of Large Numbers says unless we elaborate the meaning of "in the limit".
Probability theory tells us how to derive a new probability distribution from old probability distributions. It does not tell us how to get a probability distribution from data in the empirical world.

In the law of large numbers, the existence of the probability distribution is not deduced; it is postulated. In mathematics this is self-evident, just as if we said: arithmetic tells us how to calculate a new number from old numbers, but it does not tell us how to get a number from information about the "real world".

Patrick
PS
It seems (page 10) that Kolmogorov wrote « I have already expressed the view ...that the basis for the applicability of the results of the mathematical theory of probability to real random phenomena must depend in some form on the frequency concept of probability, the unavoidable nature of which has been established by von Mises in a spirited manner...(But) The frequency concept (of probability) which has been based on the notion of limiting frequency as the number of trials increases to infinity, does not contribute anything to substantiate the applicability of the results of probability theory to real practical problems where we have always to deal with a finite number of trials » in Sankhya, 1963.
 
  • #21
From Hardy:

Axiom 3 Subspaces.
A system whose state is constrained to belong to an M dimensional subspace (i.e. have support on only M of a set of N possible distinguishable states) behaves like a system of dimension M .

We are told earlier that a system has "states" (plural) and here is Axiom 3 with the language "A system whose state..." (singular). Axiom 3 must mean to say "A system, each of whose states is constrained to belong to an M dimensional ..."

I think the phrase "M of a set of N possible distinguishable states" means "The set of states of the subsystem has cardinality M and is a subset of the N distinguishable states of the system" rather than "Each state of the subsystem is a set of cardinality M and is a subset of the N distinguishable states of the system" (which would leave open the possibility that the set of cardinality M was a different set for each state in the subsystem).

What "behavior" of a system means has not been defined, so we have a free hand about interpreting "behaves like".

The "distinguishable states" of a system in conjuction with the number "N" presumably refers to the previous discussion about the number of states that are distinguishable after a "single shot measurement". Does that discussion contain the implicit assumption that after any given "single shot measurement", not only the same number N of states are distinguishable, but also the same set of states is distinguishable? (Two different sets of states can have the same cardinality N.) Does a "single shot measurement" describe a single type of measurement. Or might different kinds of "single shot measurements" on the same system distinguish between different sets of states?
 
  • #22
@Stephen Tashi, I'm going to check out of the discussion for a while; I have to finish some work, and your questions are beyond my ability to give a quick answer. However, there are people on PF like bhobba, Ken G and kith who have discussed Hardy's work. I hope one of them, or someone else, joins in.
 
  • #23
You might start with the Wiki entry "Digital Physics", especially the section on John Wheeler's "It from Bit"...
 
  • #24
On arxiv there is a book "From Classical to Quantum Shannon Theory" : http://arxiv.org/abs/1106.1445

http://www.albany.edu/physics/ACaticha-EIFP-book.pdf is a book prepared by Ariel Caticha (from http://www.albany.edu/physics/research_highlight.shtml) for a tutorial on Entropic Inference and the Foundations of Physics presented at EBEB-2012, the 11th Brazilian Meeting on Bayesian Statistics. See also http://iopscience.iop.org/1751-8121/labtalk-article/46167

Patrick
 
  • #25
From Hardy:

Axiom 4 Composite systems.
A composite system consisting of subsystems A and B satisfies N = N_A N_B and K = K_A K_B
If the definitions of K and N were clear, the meaning of Axiom 4 would be clear.

Axiom 5 Continuity.
There exists a continuous reversible transformation on a system between any two pure states of that system.

This doesn't specify whether the transformation is a physical transformation or merely a mathematical transformation. "Pure state" hasn't been defined yet.

I've read several papers offering "new" approaches to Quantum Mechanics or trying to develop Quantum Mechanics from a novel set of axioms. The axioms weren't precisely stated and you could only make sense of them if you kept in mind the usual form of Quantum Mechanics. Hardy's axioms have this defect: if you imagine he is talking about the usual way of presenting Quantum Mechanics, you can fit his axioms to that approach. If an author is going to require using features of the old approach in his new approach, then why not just plainly state the features that are being incorporated?

The first four axioms are consistent with classical probability theory but the fifth is not (unless the word “continuous” is dropped). If the last axiom is dropped then, because of the simplicity axiom, we obtain classical probability theory (with K = N ) instead of quantum theory (with K = N^2 ).
We don't obtain classical probability theory as a mathematical theory. We obtain a physical system described by classical probability theory.

It is very striking that we have here a set of axioms for quantum theory which have the property that if a single word is removed – namely the word "continuous" in Axiom 5 – then we obtain classical probability theory instead.

Mathematical continuity of a transformation depends on what topology is used, so there is that little detail to settle.
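For what it's worth, here is a small sanity check of the counting Hardy asserts, written in terms of the standard formulations rather than his axioms (so an illustration, not a derivation): classically K = N, quantum mechanically K = N^2, and both countings satisfy the composite-system rule of Axiom 4.

```python
def classical_K(N):
    return N            # one probability per distinguishable state (Hardy's classical K = N)

def quantum_K(N):
    return N * N        # real parameters of an N x N Hermitian density matrix (K = N^2)

for NA, NB in [(2, 2), (2, 3)]:
    N = NA * NB                                                   # Axiom 4: N = N_A * N_B
    assert classical_K(N) == classical_K(NA) * classical_K(NB)   # K = K_A * K_B classically
    assert quantum_K(N) == quantum_K(NA) * quantum_K(NB)         # and quantum mechanically
    print(N, classical_K(N), quantum_K(N))
```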
 
  • #26
I've been trying to follow the thread, but the terminology/conceptual references are getting pretty fuzzy for me. I'd like to make sure I've at least got the general idea.
For the purposes of this discussion, the state vector of an entire entangled quantum system could be considered a "mathematical object", in that an observation/measurement can be made that will result in a probabilistic outcome...right? If so, would "K" be the number of parameters expressed in that quantum state that can be observed/measured which might differentiate between potential outcome vector states?
Also, would "N" be the total number of "potential" states of that system, or would it only be the total number of states that can be reduced to by a specific single observation and/or measurement? If it's the latter, then "N" would seem to be a measurement dependent quantity rather than an inherent system specific quantity.
 
  • #27
Stephen Tashi said:
From Hardy:

The "distinguishable states" of a system in conjuction with the number "N" presumably refers to the previous discussion about the number of states that are distinguishable after a "single shot measurement". Does that discussion contain the implicit assumption that after any given "single shot measurement", not only the same number N of states are distinguishable, but also the same set of states is distinguishable? (Two different sets of states can have the same cardinality N.) Does a "single shot measurement" describe a single type of measurement. Or might different kinds of "single shot measurements" on the same system distinguish between different sets of states?
 
  • #28
It looks like my question was a general restatement of TS's question, but I don't think I saw an answer to that.
 
  • #29
The Hardy paper may get clearer as one goes further into it, since he uses terms before he defines them. For example, after stating the Axioms, he reveals that the terms "outcome" and "measurement" have different definitions. What a "single shot measurement" is, I still don't know.
 
  • #30
I think a single shot measurement is just one trial. For example, if we know that the state of a single quantum system is one of a set of (nondegenerate) orthogonal eigenvectors of an observable, then doing a single trial of a projective measurement of that observable will tell us which one of the orthogonal states the object was in. In this case, this is because each eigenstate has a different eigenvalue, and the measurement on any eigenstate yields the eigenvalue of the eigenstate with certainty.

In contrast, in quantum mechanics, if a single object is in one of two non-orthogonal states, a single measurement cannot tell us with certainty which of the two non-orthogonal states the object was in. If we have many copies of the object all in the same state (which is one of the two non-orthogonal states), then measuring the ensemble (ie. many trials and possibly many different types of measurement, not a single shot) can tell us which of the two non-orthogonal states is present.
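Here is a small simulation of that point (a sketch assuming the usual Born rule; the states |0>, |1>, |+> and the Z measurement are just illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)

def measure_z(state):
    """Single-shot Z-basis measurement: returns outcome 0 or 1 with Born-rule probabilities."""
    p0 = abs(state[0]) ** 2
    return 0 if rng.random() < p0 else 1

zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])
plus = np.array([1.0, 1.0]) / np.sqrt(2)

# Orthogonal preparations: a single shot labels the state correctly every time.
print(all(measure_z(zero) == 0 and measure_z(one) == 1 for _ in range(1000)))   # True

# Non-orthogonal preparations: |+> gives outcome 0 about half the time, so one shot
# cannot say whether the preparation was |0> or |+>.
print(np.mean([measure_z(plus) for _ in range(1000)]))                           # close to 0.5
```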
 
  • #31
A pure state is an extreme point in a convex set of states. Hardy gives examples of pure states in Eq 2.

I don't know if this is true in general, but in both classical physics and in quantum mechanics, it means that within the theory, a pure state can be taken to be the complete state of a single object. Then an ensemble in which 50% of the objects are in pure state A and 50% are in pure state B is said to be in a mixed state.

In classical physics, relying on classical probability, the convex set of states is a simplex. In quantum mechanics, the convex set of states is not a simplex (usually drawn as a circle).
 
  • #32
This leaves me still confused about the relationship between "N" and "M" in Hardy's argument. I had initially envisioned N as the total of all potential states of an entangled quantum system, and interpreted M to be a subset of those states that might be reduced to by establishing a specific parameter (K) of the system by measurement/observation.
All of this is well above my competency level in mathematics, quantum mechanics and/or information science, but I had speculated that it might be something like the conditional "rescaling" of probabilities as described on page 13 of this paper on "States of Convex Sets".
http://www.cs.ru.nl/B.Jacobs/PAPERS/convex-states.pdf

But, if N is the quantity of outcome states from a SINGLE measurement, how is the subset M of N defined?
 
  • #34
atyy said:
Let's avoid Bohmian Mechanics here. Bohmian Mechanics aims to remove the notion of observers from quantum mechanics.

http://arxiv.org/abs/1011.6451
Informational derivation of Quantum Theory
G. Chiribella, G. M. D'Ariano, P. Perinotti
Sorry, I disagree that we should avoid BM here. In http://arxiv.org/abs/1103.3506 I have developed (or, more accurately, propose to develop) a Bayesian variant of BM, where we have a real configuration q(t) -- the Bohmian trajectory -- but the wave function has a Bayesian interpretation and thus describes only our incomplete knowledge about q(t), not some independently existing animal.

And now, thank you for this link, I was not aware of it; it appears that they have found a derivation of QM which very nicely fits into such a Bayesian approach. We have, in this paper, a list of axioms which are also fulfilled by classical theory, and then a single principle which distinguishes quantum mechanics. And what is it?

"Informally speaking, our postulate states that the ignorance about a part is always compatible with a maximal knowledge of the whole." Wow. It remains to understand what this exactly means, but I think "maximal knowledge" can be translates as "maximal knowledge possible/available in quantum mechanics", so quantum states can be understood as states of knowledge, which are restricted by the ignorance "ignorance about a part".

This looks like a chance to understand quantum wave functions in a way similar to thermodynamic states of maximal entropy: not as some special physical states predefining frequencies, but simply as states which describe the information we have about the real states. So, very nice news for me. (It was, obviously, a big error to ignore this direction after seeing a lot of papers which I did not find interesting at all.)

The problem I have seen with the informational approach has always been the question "information about what?", and this is a question nicely answered by the Bohmian approach: information about the trajectory.
 
  • #35
I've figured out an interpretation of Hardy's approach, which I shall write up. Briefly, the facts are:

I will call Hardy's "Measurement Device" an "Outcome device" because, in Hardy's terminology, the device produces "outcomes", not "measurements". You may think of the Outcome device as having L+1 real numbers stamped on it and having a pointer that points to one of these numbers after the system enters it. Hence the set of possible "outcomes" is the same for all experiments. It doesn't change as a function of anything. Denote the set of outcomes by {r[0], r[1], r[2], ..., r[L+1]}. We stipulate that the pointer points to r[0] = 0 when no system is present in the device.

A "measurement" is the following procedure: Fix the knobs on all the devices. Define a particular event E in terms of the possible outcomes. The event E will be a statement that the observed outcome is in some given subset of the possible outcomes. (It might be a statement such as "The outcome is r[1] " or a statement such as "The outcome is r[1] or r[2] or r[5] ".) Perform repeated experiments and determine the probability p of the event E. The probability p is the result of the "measurement". The result of a measurement is not a vector of several probabilities. It is single number.

Because "measurement" and "outcome" are different concepts in the context of Hardy's approach, it is best to avoid hybrid phrases like "outcome of a measurement" or "measurement of an outcome", since they are ambiguous..

To define the "degrees of freedom" K of the population of physical systems being input into the experimental equipment, it is necessary to talk about a subset of a certain kind of functions being a "basis" for the entire set of functions. Considering a "measurement" to be a function, it computes a probability p as a function of the following variables:
S_P : the knob setting on the Preparation device.
S_T: the knob setting on the Transformation device
S_O: the knob setting on the Outcome device
E: The event that has been defined as a subset of the possible outcomes

I (not Hardy) define a "measurement function" to be a function h(S_P, S_T) defined by fixing the variables S_O and E. So the set of "measurement functions" is a family of functions whose variables are (S_P,S_T) and the family is parameterized by parameters S_O and E.

(Hardy says in the beginning of the paper that unless otherwise noted, we are to assume the Transformation device does not change the state of the system. However, I think no harm is done by including the variable S_T in my definition. I will assume there exists a knob setting S_Tnull on the transformation device that effectively removes the Transformation device from the experiment and allows the system to go directly from the Preparation device to the Outcome device. If the knob is to be set to S_Tnull, I will say so explicitly. )

To express the idea that there is a set of information for a state that allows us to deduce all other information about it, make the following assumption:
There exists at least one subset B = {h1, h2, ..., hK} of the family of measurement functions such that any measurement function h can be expressed as some function F(h1, h2, ..., hK) of the measurement functions in B, and B has the smallest cardinality possible for a set with this property. The function F may be different for different measurement functions h, but for the particular h in question, its particular F "works" for all values of (S_P, S_T).

The cardinality K of B is the "degrees of freedom" of the population of physical systems that are inputs to the experiments.

We can define "the number of distinguishable states" of a population of systems by avoiding any technical discussion of "state" and instead defining "state vectors". Define the function H(S_P,, ST) to be the vector valued function H(S_P,S_T) = ( h1(S_P, S_T), h2(S_P, S_T),...hk(S_P, S_T) ), where the h's are the functions in B. Evaluated at a particular (S_P, S_T) , the value of the H is a vector of probabilities. A vector of such probabilities is defined to be a "state vector".

Set S_T = S_Tnull. Vary S_P over all possible settings and compute H(S_P, S_Tnull) for each S_P. We will assume only N distinct values of H(...) occur (i.e. only N distinct "state vectors" are observed). The number N is the "number of distinguishable states" of the population of physical systems. The number N is also the "dimension" of the population of physical systems.

The assumption that N is some function of K treats K as a variable. The interpretation is that there is a (single) function F such that any population of physical systems that has degrees of freedom K has dimension N = F(K). It is a statement that considers varying the population of physical systems that are used as inputs to experiments. (I find this a surprising assumption.)
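Here is a toy numerical encoding of this construction (my own sketch, with linear combinations standing in for the functions F and the standard qubit formalism supplying the probabilities; Hardy's framework does not assume any specific physical system):

```python
import numpy as np

rng = np.random.default_rng(2)

def bloch(theta, phi):
    """Bloch vector of the qubit preparation selected by knob setting S_P = (theta, phi)."""
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

def prob_plus(prep, axis):
    """Probability of the event E = {outcome +1} when prep is measured along axis (setting S_O)."""
    return (1 + prep @ axis) / 2

preparations = [bloch(t, p) for t, p in rng.uniform(0, np.pi, size=(10, 2))]   # knob settings S_P
axes = [np.array([1, 0, 0]), np.array([0, 1, 0]), np.array([0, 0, 1]),
        np.array([1, 1, 1]) / np.sqrt(3)]                                      # knob settings S_O

# Rows of the table are measurement functions h (one per S_O, with E fixed);
# columns are preparation settings S_P; entries are the probabilities p.
table = np.array([[prob_plus(prep, ax) for prep in preparations] for ax in axes])

K = np.linalg.matrix_rank(table)   # size of a minimal spanning set of measurement functions
print(K)                           # 4, while Hardy's N for a qubit is 2, so K = N^2
```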
 
