#1
jack476
I just got Pinter's book, "A Book of Abstract Algebra", for the modern algebra course that I'm taking. It's a very nice book, I'm enjoying reading through it so far.
What I find especially interesting is the connection to computer science and controls, mostly because I switched to math and physics from electrical engineering. Anyway, in its introductory chapter on groups, the book makes the following statement:
Groups in Binary Codes
The most basic way of transmitting information is to code it into strings of 0s and 1s, such as 0010111, 1010011, etc. Such strings are called binary words, and the number of 0s and 1s in any binary word is called its length. All information may be coded in this fashion.
(Emphasis mine).
Out of curiosity, I am wondering whether a proof of this statement exists: that is, a proof that any piece of information in the universe, of arbitrary complexity and abstraction, could be encoded as a string of binary digits, assuming one could access that information and had a large enough storage device.
Intuitively, I would say it's obvious: each digit can store a piece of information, and in theory we can always increase the capacity by adding digits. But I'm wondering whether a rigorous proof exists.
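To make that intuition concrete, here is a minimal sketch (in Python; the function names and alphabet are my own, not from the book) showing the standard construction: any message over a finite alphabet of size n can be encoded as a binary string by assigning each symbol a fixed-width codeword of ceil(log2 n) bits, and the encoding is invertible:

```python
# Sketch: encode any message over a finite alphabet as a binary string,
# using a fixed-width binary codeword for each symbol.
from math import ceil, log2

def encode(message, alphabet):
    """Map each symbol of `message` to a fixed-width binary codeword."""
    width = max(1, ceil(log2(len(alphabet))))          # bits per symbol
    index = {sym: i for i, sym in enumerate(alphabet)}  # symbol -> number
    return "".join(format(index[sym], f"0{width}b") for sym in message)

def decode(bits, alphabet):
    """Invert the encoding: read fixed-width chunks back into symbols."""
    width = max(1, ceil(log2(len(alphabet))))
    return [alphabet[int(bits[i:i + width], 2)]
            for i in range(0, len(bits), width)]

alphabet = ["a", "b", "c", "d"]          # any finite set of symbols
msg = ["b", "a", "d"]
bits = encode(msg, alphabet)             # -> "010011"
assert decode(bits, alphabet) == msg     # round-trips losslessly
```

This only demonstrates the finite (or, by extension, countable) case; whether "every single piece of information in the universe" is of that kind is exactly the part of the claim I'm unsure how to formalize.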
(Note: I put this in the abstract algebra section because it came up in an abstract algebra textbook; I will understand if the mods feel it is more appropriate in the computer science section.)