Is DNA Information? A Look Into Biologists' Perspectives

In summary: There is a point here that is very similar to a common misconception about evolution. Usually, when two species share a similar trait, we explain it as having been present in the common ancestor of the two species. But what about the cases where a similar trait must unarguably have evolved separately, because it arose in each species after the two diverged? The key is to recognize that the trait is similar because the environmental pressures that drove its development were similar, so it is not so surprising that the evolved solutions converged.
  • #36
Perhaps this is tangential to where the discussion has gone, but I've just discovered the thread...

Douglas Hofstadter, in Gödel, Escher, Bach, questions whether the instructions to make a given set of proteins can really be said to be stored in DNA. His discussion is extensive (being woven through several chapters), but it comes down to asking how much of that information is in the base sequences of the DNA and how much is added by the chemical context in the cell. By analogy, he asks how much of the meaning of a text (like the Rosetta Stone) is in the characters and how much is in the context provided by the culture. Or how much of the emotional response to a song is in the melody or chord sequence and how much is added by the listener? How much of the meaning of a John Cage composition is carried by the surrounding tapestry of Western music?

He also refers to Schrödinger's observation that humans expect to find meaning in aperiodic crystals: things that show regular form on one level but not on another. A book (being a bound, orderly stack of rectangular pages, with markings on the pages drawn from a small set of characters) has a regular form. The order of the markings, however, is not obviously patterned until a decoding mechanism is found.

It's a very engaging discussion, which I probably do not do justice to in my summary.
 
  • #37
SW VandeCarr said:
The concept of information has actually been quite well defined since Claude Shannon published his work on information theory in 1948. The Shannon information of an event E (such as a particular sequence of base pairs) can be expressed:

[tex]I(E) = -c \log_2(P(E))[/tex]

where P(E) is the probability of E and c is a positive constant.

There is a close relation between entropy and information as formal concepts. Entropy is a function of the number of possible states of a system, and information is a function of the realization of a particular state of that system. If a system can have many possible states, then the realization of a particular state carries a high information measure. With log base 2, the measure is in 'bits'. A system that can exist in only one state yields zero bits of information when that state is realized.
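
As a rough numeric sketch of the quoted formula (Python, with c = 1; this illustration is my own, not from the post), showing that a system with many equally likely states yields more bits per realized state, and a one-state system yields zero bits:

```python
import math

def shannon_information(p_event, c=1.0):
    """Shannon information I(E) = -c * log2(P(E)), in bits when c = 1."""
    return -c * math.log2(p_event)

def entropy_uniform(n_states):
    """Entropy of a system with n equally likely states, in bits."""
    return math.log2(n_states)

# A system with 4 equally likely states: each realization carries 2 bits.
print(shannon_information(1 / 4))   # 2.0
print(entropy_uniform(4))           # 2.0

# A system with only one possible state: zero bits on realization.
print(shannon_information(1.0))     # -0.0, i.e. zero bits
```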

All very true; when I said information is ambiguous, I meant the colloquial use. Some will use it to mean "language"; some geneticists might say that most of the DNA is not information if they are referring to non-coding sequences, etc.
 
  • #38
ryan_m_b said:
All very true; when I said information is ambiguous, I meant the colloquial use. Some will use it to mean "language"; some geneticists might say that most of the DNA is not information if they are referring to non-coding sequences, etc.

Well, think about how we represent any kind of knowledge, or even the neural pathways by which sense data is processed, stored, and acted upon. Language is not a random collection of sounds or symbols. Of all possible configurations of a string of length n using k symbols including spaces (its entropy), only a small fraction of those configurations carry information in any human language, and a still smaller fraction in a language you might understand. Of all possible visual patterns that could be processed by the human retina, only a small fraction are patterns that make sense based on our training and expectations.
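
A rough illustration of that point in Python (my own sketch; the count of "meaningful" strings below is a made-up placeholder, not a measured value): the number of possible strings grows as k^n, so the entropy of the ensemble grows as n·log2(k), while meaningful strings are a vanishing fraction of that space.

```python
import math

k = 27   # illustrative symbol set: 26 letters plus a space
n = 20   # string length

total_configurations = k ** n
entropy_bits = n * math.log2(k)   # log2(k^n) = n * log2(k)

# Hypothetical count of length-20 strings that read as meaningful English;
# purely illustrative, chosen only to show how small the fraction is.
meaningful = 10 ** 12

print(f"possible strings: {total_configurations:.3e}")
print(f"entropy of the ensemble: {entropy_bits:.1f} bits")
print(f"'meaningful' fraction (illustrative): {meaningful / total_configurations:.3e}")
```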

In pharma R&D there are sophisticated programs for visualizing the conformational dynamics of proteins and other activities involving receptor sites and ion channels. Obviously these programs are nothing more or less than pure information, which can be measured in bits (or nits or dits if you prefer; those use the natural log base or base 10, but I'll stick with bits).
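
For the units mentioned above, a minimal sketch (my own addition) of how the same quantity comes out in bits, nits (natural log), or dits (base 10), depending only on the log base:

```python
import math

def information(p, base=2):
    """-log_base(P(E)): base 2 gives bits, base e gives nits, base 10 gives dits."""
    return -math.log(p, base)

p = 1 / 8
print(information(p, 2))        # 3.0 bits
print(information(p, math.e))   # ~2.079 nits
print(information(p, 10))       # ~0.903 dits
```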
 
  • #39
DaveC426913 said:
Thing is, I am finding a concept here that identifies a particular kind of information. I just haven't figured out how to define it yet.

The closest I can come is 'blueprints'.

There's all sorts of information out there in the universe but only a tiny subset of it is used the way life uses it - as a language that follows rules to build stuff. I can't think of a single non-life example where configurations are stored in an abstract form, then "read" to make something.

Ah, I hadn't noticed your other post. I think the general form of information doesn't require life, but yes, I agree that life "makes use" of information in a sophisticated way, social creatures even more so, and humans (in my biased opinion) make the most of it and even generate it solely so that others can receive it (e.g. art). Notice, I'm still not talking about meaning, though, only the transmission and reception of visual information (in the case of art).

Meaning hinges on semantic information, which arises when a system has a very large memory capacity and begins to classify particular information structures as a "type" (so you have a word for "apple", a semantic designation that comes from several exposures to similar-looking information: that conveying an apple). This is where redundancy and compression come in. In physics, we start with hundreds of equations, shoehorn them into four equations (Maxwell's equations), then pressure-cook them into a single final equation using the d'Alembertian. So there's not really a lot of information in that final equation. All the information is really in the brain. The final equation is more or less a title which a human can use to unpack (from their own brain) the deeper meanings through derivations (another algorithm, another "blueprint", but a procedural one).
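
For concreteness (the specific compact form below, in SI units and the Lorenz gauge, is a standard textbook result I'm adding for illustration, not the poster's own): the four Maxwell equations can be packed into a single wave equation for the four-potential using the d'Alembertian,

[tex]\Box A^\nu = \mu_0 J^\nu[/tex]

which is exactly the kind of highly compressed "title" described above: short to write, but only meaningful to someone who can unpack it.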

But ultimately, meaning is simply compressed information (semantics or semiotics) that represents a larger set of information (your episodic exposure to the concept). The word itself contains very little information. The neuroethological complex that the word sets off in a human brain is really where the bulk of the information has been stored, through iterative exposure and association.
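
A loose sketch of that compression idea in code (my own analogy, nothing from neuroscience): a short label stands in for a much larger, redundant body of stored information, and a general-purpose compressor makes the redundancy explicit.

```python
import zlib

# The "stored experience": a large, highly redundant body of information.
episodes = ("round red fruit, sweet, grows on trees, fits in the hand. " * 200).encode()

# The "word": a tiny semantic label that merely points at all of the above.
label = b"apple"

compressed = zlib.compress(episodes)

print(len(label))        # 5 bytes: the label itself carries very little information
print(len(episodes))     # thousands of bytes of (redundant) stored exposure
print(len(compressed))   # far smaller than the raw episodes, because they are redundant
```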

But "blue prints" are essentially a map of the general information. The map tells you the geometry: what goes where. In the universe then, a Newtonian/euclidian map would tell you the position and momentum of every particle, along with a matrix of interactions between all possible pairs of particles (i.e. forces).
 
