Relative Entropy or Kullback-Leibler Divergence

bowlbase

Homework Statement


I am supposed to calculate the relative entropy between two sets of data:
Set 1 (base set):
A C G T
0 0 0 10
0 0 0 10
0 0 10 0
0 10 0 0
10 0 0 0
* * * * //Randomized
0 0 0 10
0 10 0 0

Set 2:
A C G T
0 0 0 10
0 0 0 10
0 0 10 0
0 10 0 0
10 0 0 0
1 4 1 4
0 0 0 10
0 10 0 0

These are frequency-of-occurrence matrices. Set 2 is the matrix produced after a variable number of characters is mutated; in this case only one character in the third-from-bottom row was mutated, which is why that row has no 10s. Every other position didn't mutate, so its counts match Set 1. I have 70 other sets of this data with varying numbers of mutations and lengths.

I am trying to read about this online, but the information is convoluted and often seems to actively avoid defining variables. Can someone walk me through the process?

Homework Equations
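Presumably the relevant definition is the relative entropy (Kullback-Leibler divergence) between two discrete distributions $P$ and $Q$, applied here one row (position) at a time:

$$D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{i} P(i) \log_2 \frac{P(i)}{Q(i)}$$

where the sum runs over the four bases A, C, G, T at a given position, the base-2 logarithm gives the result in bits, and terms with $P(i) = 0$ are taken to contribute zero.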

The Attempt at a Solution
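As a minimal sketch, assuming each row's counts are meant to be normalized into probability distributions and that a small pseudocount is acceptable for handling the zero counts, the per-position divergence could be computed like this (the uniform placeholder for Set 1's randomized row is an assumption; substitute the real counts):

```python
import numpy as np

# Frequency-of-occurrence counts: one row per position, columns A, C, G, T.
set1 = np.array([
    [ 0,   0,   0,  10],
    [ 0,   0,   0,  10],
    [ 0,   0,  10,   0],
    [ 0,  10,   0,   0],
    [10,   0,   0,   0],
    [2.5, 2.5, 2.5, 2.5],  # uniform placeholder for the randomized row (assumption)
    [ 0,   0,   0,  10],
    [ 0,  10,   0,   0],
])

set2 = np.array([
    [ 0,  0,  0, 10],
    [ 0,  0,  0, 10],
    [ 0,  0, 10,  0],
    [ 0, 10,  0,  0],
    [10,  0,  0,  0],
    [ 1,  4,  1,  4],  # the mutated row
    [ 0,  0,  0, 10],
    [ 0, 10,  0,  0],
], dtype=float)

PSEUDOCOUNT = 0.5  # keeps every probability nonzero so log2 and division are defined

def row_probabilities(counts):
    """Turn raw counts into one probability distribution per row."""
    smoothed = counts + PSEUDOCOUNT
    return smoothed / smoothed.sum(axis=1, keepdims=True)

def relative_entropy(p, q):
    """D_KL(P || Q) in bits, computed independently for each row (position)."""
    return np.sum(p * np.log2(p / q), axis=1)

p = row_probabilities(set2)  # mutated set
q = row_probabilities(set1)  # base set
per_position = relative_entropy(p, q)

print("Per-position D_KL (bits):", np.round(per_position, 4))
print("Total D_KL (bits):", round(per_position.sum(), 4))
```

Without the pseudocount, any position where Set 1 has a zero count but Set 2 doesn't would make the divergence infinite, which is why smoothing (or comparing each set against a uniform background instead) is a common choice here.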

 
Never mind, I've got it!
 