Retired computer engineer here. I have a professor's unpublished notes, dating back to 1974, entitled "Computer Communication Theory", containing a great deal of material on probability, random variables, and Markov chains, as well as old Bell System Journal notes about hardware and protocols. Has this...
Is there any use for this concept in classical branches of physics? Can it be of any help for a physicist in resolving problems (or, at least, in resolving them more efficiently when compared with traditional methods)?
The word «classical» means exactly that, i.e. mechanics, hydrodynamics...
My understanding of quantum theory and information theory is that, given complete information on the state of the universe at present, it is possible to predict its state at all times in the future and past. Three questions: 1. Is this true? 2. How are quantum-probabilistic outcomes accounted for...
[Mentor Note -- thread moved from the schoolwork forums to the technical forums]
Homework Statement: Tentative note and summary on the origin and evolution of information in the universe.
Relevant Equations: none
As a teacher of physics I got many questions asked by my students when...
I was reading a paper written by George Smoot [1], which assumes the holographic principle as true and conjectures that our universe would be encoded on the "surface" of an apparent horizon as the weighted average of all possible histories. In that way, there would be one world (or universe)...
Do you have an opinion about my summary above?
Do you understand the relation between irreversible logic and irreversible process?
According to Landauer, logical irreversibility implies physical irreversibility. It seems to me this is still a topic of debate. Is the debate also about what logic...
The Knill-Laflamme (K-L) condition involves projection operators onto the codespace of the error-correcting code, as I understand it. My confusion, I think, comes primarily from what exactly these projections are. That is, how would one find these projections for, say, the Shor 9-qubit code?
In general, if R is the recovery channel of an error channel ε, with state ρ, then
and according to these lecture slides, we get the final result highlighted in red for a bit flip error channel. I am simply asking how one reaches this final result. Thank you (a full-ish derivation can be found...
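Since the slides themselves aren't reproduced here, the sketch below is my own, using the simpler 3-qubit bit-flip code rather than Shor's 9-qubit code (the structure of the projectors and the recovery is the same idea). All numbers are toy values; nothing here is taken from the lecture.

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])

def kron3(a, b, c):
    return np.kron(np.kron(a, b), c)

# Codespace of the 3-qubit bit-flip code is span{|000>, |111>}; P projects onto it.
e0, e1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
ket000 = kron3(e0, e0, e0)
ket111 = kron3(e1, e1, e1)
P = np.outer(ket000, ket000) + np.outer(ket111, ket111)

# Correctable errors: no flip, or a single X on one of the three qubits.
E = [kron3(I2, I2, I2), kron3(X, I2, I2), kron3(I2, X, I2), kron3(I2, I2, X)]

# Syndrome-subspace projectors P_k = E_k P E_k^dagger -- these are the projections
# the Knill-Laflamme construction refers to (all operators here are real).
Pk = [Ek @ P @ Ek.T for Ek in E]

# An encoded logical state alpha|000> + beta|111>, as a density matrix.
alpha, beta = 0.6, 0.8
psi = alpha * ket000 + beta * ket111
rho = np.outer(psi, psi)

# Bit-flip error channel: with probability p a given single qubit flips
# (at most one flip, so the error stays inside the correctable set).
p = 0.1
probs = [1 - 3 * p, p, p, p]
rho_err = sum(q * Ek @ rho @ Ek.T for q, Ek in zip(probs, E))

# Recovery channel: Kraus operators R_k = E_k^dagger P_k
# (project onto the k-th syndrome subspace, then undo the k-th error).
rho_rec = sum((Ek.T @ Pk_) @ rho_err @ (Ek.T @ Pk_).T for Ek, Pk_ in zip(E, Pk))

print(np.allclose(rho_rec, rho))   # True: the encoded state is recovered exactly
```

The key point the check illustrates: the four syndrome subspaces are mutually orthogonal, so each projector picks out exactly one error branch, and undoing that error returns the original code state.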
As I've been studying statistical mechanics as well as some other things, I keep hearing about "information theory". For instance, I've heard about information theory as it relates to entropy, regarding some theorems of statistical mechanics, and I even heard about it in a Carl Bender lecture...
I only see it brought up in creationist attacks on evolution (definitely NOT trying to bring that up); I'm curious if and how real biological science uses it. There are a couple of (expensive) older books and paywalled papers that seem legitimate, but I cannot find much else
for example...
The paper is reasonably old and was written as a PhD thesis by (I believe) a man from China. It was basically the first paper on the subject, and in it he effectively (from what I understand) dropped particles into a black hole, counting the information added, and saw that the black hole changed...
Hello!
I would like your help in finding graduate-level science books and articles on the following subjects:
1. Far from equilibrium statistics.
2. Information theory and entropy.
3. Negentropy.
4. And Maxwell's demon.
My main goal is to be able to understand and explore the Maxwell's demon...
Hi everyone,
this is sort of a soft question which I need to ask to make sure my understanding is correct, it relates to a little project I'm doing on measurement resolution. The first question is to clear up a general concept, the second is based on the first and is the actual question...
Hi, I'm interested in self-studying so that I can learn / understand integrated information theory of consciousness. I was wondering if anyone could help me identify what courses (I'm looking at using MIT's OpenCourseWare to study, although just knowing the types of math needed would be enough) I would...
So let suppose I have a random variable Y that is defined as follows:
$$Y=\alpha x+ \mathcal{N}(\mu,\sigma) \quad \text{where } x \in \mathbb{R}$$
and
$$\mathcal{N}(\mu,\sigma)\text{ is an i.i.d. normally distributed random variable with mean }\mu\text{ and variance }\sigma$$
So...
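The post is cut off here, but as a quick numerical sanity check of the definition above (alpha, x, mu, and the variance are arbitrary example values I've assumed), Y is just a Gaussian with mean alpha*x + mu:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, x, mu, var = 2.0, 1.5, 0.0, 1.0           # assumed example values
samples = alpha * x + rng.normal(mu, np.sqrt(var), size=200_000)
print(samples.mean(), samples.var())             # approx alpha*x + mu = 3.0, approx var = 1.0
```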
Hello Physics Forums Community,
I'm Cliff, a DBA working at present for a financial company, with a background in philosophy and art. (So expect many of the questions I will be asking to be at the 'math for English majors' level :-) ) I have recently been working with Fred I. Dretske's...
1- What's the relationship between entropy and information? (A rough sketch follows below.)
2- Can the statement 'entropy always increases' imply that information is lost?
3- If it does, how is it lost?
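For question 1, one common way to connect the two notions (a sketch of the standard identification, not a complete answer) is to compare the Shannon entropy of a probability distribution with the Gibbs entropy of the same distribution over microstates:
$$H = -\sum_i p_i \log_2 p_i \ \text{(bits)}, \qquad S = -k_B \sum_i p_i \ln p_i = k_B \ln 2 \cdot H.$$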
Consider three identical boxes of volume V. The first two boxes will contain particles of two different species, 'N' and 'n'.
The first box contains 'N' identical non interacting particles in a volume V. The second box contains 'n' non interacting particles. The third box is the result of mixing...
An interesting paper has appeared on nature.com:
http://www.nature.com/articles/srep32815
The abstract:
I expect this to spawn plenty of pop science claims about "scientists say we can reverse entropy". But the paper itself looks like a good discussion of how the second law actually works...
Homework Statement
I am not a student, but one poster was kind enough to answer my stupid question last week, and I was wondering if anyone would mind if I posted another stupid question.
When an object is moved in a specific direction, how is the direction of momentum stored or recorded? By...
I am reading a book called 'Quantum Processes, Systems, and Information', and in the beginning a basic idea of information is set out in the following way. If information is coded in 'bits', each of which has two possible states, then the number of possible different messages/'values' that can be...
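The sentence is cut off above, but the count it is presumably leading up to is the standard one:
$$\text{number of distinct messages with } n \text{ bits} = 2^{n}, \qquad \text{e.g. } n = 3 \;\Rightarrow\; 2^{3} = 8.$$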
Hello, I have to work on the relation between the thermodynamics and the information theory on both historical and theoretical aspects. My work will not contain proof. It will contain the most important equations and descriptive paragraphs. I need to talk about the relation between Clausius and...
Can anyone recommend any good reading on Maxwell's demon? I'm mostly looking for things at the undergraduate level, but I don't mind something less rigorous or more advanced.
(Apologies to the mods if this is in the wrong forum.)
I have made an interesting observation that I can't explain to myself. Think about a prior probability P and a posterior probability Q. They are defined on an event space W with only three elements: w1, w2, and w3 (the number of elements won't matter as long as it's finite). The Kullback-Leibler...
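As a concrete illustration of the setup (the numbers are placeholders I've chosen, not the original poster's), here is a short Python sketch computing the Kullback-Leibler divergence in both directions on a three-element space:

```python
import numpy as np

P = np.array([0.5, 0.3, 0.2])   # prior over {w1, w2, w3}
Q = np.array([0.1, 0.2, 0.7])   # posterior over {w1, w2, w3}

def kl(a, b):
    """D_KL(a || b) = sum_i a_i * log2(a_i / b_i), in bits."""
    return float(np.sum(a * np.log2(a / b)))

print(kl(Q, P), kl(P, Q))   # generally not equal: the divergence is asymmetric
```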
Hi all!
I would like to learn the basics of information theory and want a good book to do so.
My math level is that of a second year undergraduate physics student, but I don't mind if I have to struggle a bit through it.
Thanks!
In Euclidean geometry (and presumably also in non-Euclidean geometry), the part of the line that bisects the vertex angle and lies inside the isosceles triangle is shorter than the legs of the isosceles triangle. Let ABC be an isosceles triangle with AB as the base. Then, for...
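One way to see the Euclidean case (assuming legs CA = CB = L, apex angle 2θ at C, and M the point where the bisector from C meets AB): in an isosceles triangle the bisector from the apex is also the altitude to the base, so
$$|CM| = L\cos\theta < L \qquad \text{for } 0 < \theta < \tfrac{\pi}{2},$$
hence the in-triangle part of the bisector is strictly shorter than either leg.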
Hi,
I am reading Shannon's paper on the theory of communication and I am having trouble with a concept.
Shannon writes:
The output of a finite state transducer driven by a finite state statistical source is a finite state statistical source, with entropy (per unit time) less than or equal to that of...
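A toy special case of that statement (my own example, not Shannon's): a memoryless source pushed through a deterministic, symbol-by-symbol, many-to-one transducer. Merging symbols can only lower (or preserve) the per-symbol entropy.

```python
import math

p_in = {'a': 0.5, 'b': 0.25, 'c': 0.25}   # assumed source distribution
f = {'a': 0, 'b': 1, 'c': 1}              # transducer output map (many-to-one)

p_out = {}
for s, p in p_in.items():
    p_out[f[s]] = p_out.get(f[s], 0.0) + p

H = lambda dist: -sum(p * math.log2(p) for p in dist.values() if p > 0)
print(H(p_in), H(p_out))   # 1.5 bits vs 1.0 bit: output entropy <= input entropy
```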
I would like to know, in the following equation (attached), how I can incorporate the BER for BPSK. Is the BER the same as Rc?
The equation is the relation between SNR and sigma squared.
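The attached equation isn't visible here, but for BPSK over an AWGN channel the textbook bit-error-rate relation is BER = Q(√(2·Eb/N0)) = ½·erfc(√(Eb/N0)); Rc usually denotes the code rate, which is a different quantity. A small sketch (the Eb/N0 values are just examples):

```python
import numpy as np
from scipy.special import erfc

def bpsk_ber(ebn0_db):
    """Theoretical BPSK bit error rate over AWGN for a given Eb/N0 in dB."""
    ebn0 = 10.0 ** (ebn0_db / 10.0)
    return 0.5 * erfc(np.sqrt(ebn0))

for snr_db in (0, 4, 8):
    print(snr_db, "dB  ->  BER =", bpsk_ber(snr_db))
```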
I understand how using classical or Bayesian statistical inference is often very helpful for solving information theory problems, or for improvements in data managing or manipulation of learning algorithms. But the other way around (using information-theory knowledge to find a way in inference), I can't find...
We know from information theory that the entropy of a function of a random variable X is less than or equal to the entropy of X.
Does this break the second law of thermodynamics?
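A quick numerical check of the first statement (my own toy example): applying a function that merges outcomes cannot increase the Shannon entropy.

```python
import math

H = lambda ps: -sum(p * math.log2(p) for p in ps if p > 0)
pX  = [0.25, 0.25, 0.25, 0.25]   # X uniform on {0, 1, 2, 3}
pfX = [0.5, 0.5]                 # distribution of f(X) with f(x) = x mod 2
print(H(pX), H(pfX))             # 2.0 bits >= 1.0 bit
```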
Hi everyone, I'm looking to go to graduate school for a Master's in scientific computing or computational science but want to go back for a PhD in physics. I'm just starting to look into quantum information theory and while I find plenty of PDF files and articles about the topic I can't find...
In (Shannon) information theory, information is said to be the log of the inverse of the probability. What, then, is the information content of 1+1=2? Or, for that matter, of any fundamental axiom?
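One common reading (a sketch, not the only possible answer): Shannon self-information is
$$I(x) = \log_2\frac{1}{P(x)}, \qquad P(x) = 1 \;\Rightarrow\; I(x) = \log_2 1 = 0 \text{ bits},$$
so a statement treated as certain within its own system, such as an axiom, carries zero self-information.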
I was trying to understand wave function collapse in terms of superposition, but I ran into some problems when relating it back to information theory/entropy. It is given, in the definition of information in terms of entropy, that energy is needed to transfer information. That is something we have always...
Homework Statement
Let X1, ..., Xn be a message from a memoryless source, where the Xi are in A. Show that, as n → ∞, the proportion of messages in the typical set converges to zero, unless Xi is uniform on A.
Homework Equations
The Attempt at a Solution
Confused, possibly because...
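One standard way to frame the argument for the problem above (a sketch only, using the usual typical-set size estimate): the typical set contains roughly 2^{nH(X)} sequences, while there are |A|^n = 2^{n·log2|A|} messages in total, so
$$\frac{\bigl|T^{(n)}_{\epsilon}\bigr|}{|A|^{n}} \approx 2^{-n\left(\log_2|A| - H(X)\right)} \longrightarrow 0 \quad (n \to \infty) \quad \text{unless } H(X)=\log_2|A|,$$
and H(X) = log2|A| holds exactly when the Xi are uniform on A.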
I will finish my Master's studies in theoretical physics next year, and I want to do a PhD in Quantum Information (my main interest). I want to apply to universities in the UK and Germany (right now I restrict myself to these two countries). What are the universities you know of that have Quantum...
Information Theory - Shannon's "Self-Information" units
Hi,
I'm familiar with information and coding theory, and do know that the units of Shannon information content (-log_2(P(A))) are "bits". Where "bit" is a "binary digit", or a "storage device that has two stable states".
But, can...
Hello,
I would like someone to suggest me a good book on entropy and information theory.
I need something that explains these subjects intuitively, rather than all mathematics.
I have fairly strong knowledge of the mathematics behind entropy, but it's all kind of scrambled as to what is what...
I do not know if this is the right place for this post; if I am making a mistake by putting it here, please let me know where the right place to put it is.
So, I am learning information theory; this is my first approach to it, and I would like to know a few names of good books for...
Does the fact that there is a limit on how much can be observed of an electron's location and momentum have anything to do with the finiteness and conservation of information?
Is the total momentum plus location of an electron unknown to us, or is it also unknown to the universe?
Meaning, does...
Hi!
As I understand it, Quantum Information Theory is an attempt to apply classical information theory (i.e., 0s and 1s) to the quantum realm of superpositions.
I recently came across a fascinating interpretation of QIT wherein it was described as possibly the law that effectively...
Hi,
I am a student in Europe. I have been reading through the posts on Graduate schools and the essays like the one by Zapper on how to become a physicist. I am now in the stage of searching for a PhD position.
1. I would like to know what you guys think about the scope of Quantum...
So I am enrolled in this course.
And we are learning probability and statistics and all that good stuff.
But one thing bothers me VERY much.
I lie; a lot of things bother me about this course, but here is the first one:
Transformation of random variables.
So far we have been...
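For what it's worth, the standard change-of-variables result for a monotone, differentiable transformation Y = g(X) of a continuous random variable is
$$f_Y(y) = f_X\!\left(g^{-1}(y)\right)\left|\frac{d}{dy}\,g^{-1}(y)\right|,$$
so, for example, Y = aX + b with a ≠ 0 gives f_Y(y) = f_X((y - b)/a) / |a|.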
Hello All,
I have not yet entered undergrad EE. I am on a mid-year break and need an intro-level book. It should be formal and factual and, where necessary, technical, but at the same time it should outline:
- historical development of electronics and electrical technologies
- historical...
Hi, although I've studied info theory briefly in the past, now that I'm revisiting it I find myself scratching my head trying to understand its counter-intuitive logic.
For instance,
I understand that
the amount of uncertainty associated with a symbol is correlated with the amount of...
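A tiny numerical illustration of that relationship (the probabilities are chosen arbitrarily): the less probable the symbol, the more self-information it carries.

```python
import math

for p in (0.5, 0.1, 0.01):
    print(f"P = {p:>4}: {-math.log2(p):5.2f} bits")
```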
What is the consensus here about Information Theory beyond the Standard model?
The three fundamental theories of the universe, relativity, quantum mechanics, and the second law of thermodynamics all involve limitations on the transfer, accessibility, quantity, or usefulness of information...
Hi Guys,
I am in a real dilemma about which of these courses to take. Obviously I prefer theory, but I know my limitations too. Can someone shed some light on which of these two courses is really good, not in terms of employment but in terms of knowledge and content? I might decide between these two courses for my grad...
Information theory and cybernetics (Shannon, Wiener), along with the perspective of some physicists (Schrödinger), were very influential on the development of molecular genetics.
The molecular genetic approach has shown its power to explain many biological and medical problems, from evolution...
Hi everyone, I'd like to start by saying that I have no background or knowledge of maths other than high school. As a consequence, I have no idea if this question is stupid, trivial, nonsensical or whatever. I also don't even know if it's in the correct thread (I made the best guess I could). So...
Homework Statement
A) Select the uniquely decodable codes and the instantaneous codes from Codes 1 to 5 in the image below:
B) A personal question about second-order extension probabilities (see the sketch after this excerpt). If we have:
probability of a symbol a P(a) = p1
probability of a symbol b P(b) = q1
Which is the...
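Since the image with Codes 1 to 5 isn't available in this excerpt, here is only a generic sketch (the codewords and probabilities are made up): for part (A), the instantaneous property reduces to a prefix-freeness check on whatever codewords the image lists, and for part (B), if the source is memoryless, the second-order extension probabilities are just products of the single-symbol probabilities.

```python
from itertools import product

# (A) Prefix-freeness (instantaneous) test on a made-up example code.
code = {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
words = list(code.values())
prefix_free = not any(u != v and v.startswith(u) for u in words for v in words)
print("prefix-free (instantaneous):", prefix_free)

# (B) Second-order extension of a memoryless source with P(a) = p1, P(b) = q1:
# pairs are emitted independently, so P(xy) = P(x) * P(y).
p1, q1 = 0.7, 0.3
P = {'a': p1, 'b': q1}
for x, y in product('ab', repeat=2):
    print(x + y, P[x] * P[y])
```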