Need help with Matlab standard deviation

In summary, Kerry said that the standard deviation for a log-normal distribution is not the same as the standard deviation for a normal distribution.
  • #1
spoonyluv
Hello,

I seriously need some help as I can't figure out what to do here. I am working on coding a vaccination simulation that measures rates of infection over a period of 100 days.

I have this variable delta, which is the number of days the infection period lasts. Delta is 5 on average, which means it could occasionally be 1 day for one random individual, 3 days for another, or even 7, but on average it is 5.

I have 10 individuals and run the simulation for 100 days. For each individual I need MATLAB to create a random standard deviation that is part of a normal distribution. So, for example, if I have a log-normal formula

R = lognrnd(5, ?, 100, 10)

This call creates a 100x10 matrix of log-normally distributed values, one per cell, with 5 as the mean. The problem is the ?, which is where the standard deviation goes. How do I get MATLAB to churn out a standard deviation of a normal distribution that will then feed into R = lognrnd(5, ?, 100, 10)?
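
(For orientation, a minimal sketch of the call once some value is supplied for the missing argument. The 0.5 below is only a placeholder, and note that lognrnd expects the mean and standard deviation of the underlying normal distribution, not of the log-normal values themselves.)

% Sketch only -- 0.5 is a placeholder for the unknown sigma.
% lognrnd(mu, sigma, m, n) returns an m-by-n matrix where log(R) ~ N(mu, sigma^2).
R = lognrnd(5, 0.5, 100, 10);   % 100 days x 10 individuals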

Thanks
 
  • #2
There is no "standard" standard deviation for a log-normal distribution. See: http://en.wikipedia.org/wiki/Log-normal_distribution. This is another input that you need to give to MATLAB, not the other way around.

In other words, you can have a distribution where 99% of the population has an infection period of 4-6 days, with the other 1% being "other." You could also have a distribution where 50% of the population has an infection period of 4-6 days, with the other half being "other." In both cases it is possible for the distribution to be log-normal.

-Kerry
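
(A sketch of what supplying that extra input can look like in practice: the spread of the infection period is a choice you make. The target mean of 5 days comes from the original post; the 2-day standard deviation is an arbitrary illustrative choice. The conversion uses the standard log-normal relations and assumes the Statistics Toolbox functions lognrnd and lognstat.)

% Choose the mean and spread you want for the infection period itself,
% then convert them to the mu/sigma of the underlying normal distribution.
m = 5;                            % desired mean infection period (days)
s = 2;                            % desired std of the infection period (days) -- your choice
sigma = sqrt(log(1 + (s/m)^2));   % std of the underlying normal distribution
mu    = log(m) - sigma^2/2;       % mean of the underlying normal distribution
R = lognrnd(mu, sigma, 100, 10);  % 100 days x 10 individuals
[M, V] = lognstat(mu, sigma);     % check: M is 5 and sqrt(V) is 2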
 
  • #3
Oooh, I think I'm wrong in the above post (or at least I'm not sure I'm right). I responded because I've been looking at normal distributions recently and they were fresh in my mind - but these are NOT the same as a log-normal distribution.

My advice - ignore the above post :-)

Sorry for the confusion...

-Kerry
 

Related to Need help with Matlab standard deviation

What is standard deviation in Matlab?

Standard deviation in Matlab is a measure of the spread of data around its mean. It tells you how much the data varies from the average.

How do I calculate standard deviation in Matlab?

To calculate standard deviation in Matlab, you can use the std function. This function takes a vector or matrix of data and returns the standard deviation; for a matrix it operates on each column by default.
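
A minimal example (the data values are made up for illustration):

data = [4 5 6 5 7 3 5 5 6 4];   % example infection periods in days
s = std(data);                  % sample standard deviation, normalized by N-1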

Can I calculate standard deviation for a specific subset of data in Matlab?

Yes, you can use the std function with specific indexing to calculate standard deviation for a subset of data in Matlab.
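
For example (using a made-up 100x10 matrix of days by individuals, as in the simulation above):

data = randn(100, 10);     % example matrix: 100 days x 10 individuals
s1 = std(data(:, 1));      % std for the first individual (column 1)
s2 = std(data(1:50, 3));   % std of days 1-50 for individual 3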

What is the difference between standard deviation and variance in Matlab?

Standard deviation and variance are both measures of the spread of data in Matlab. The main difference is that standard deviation is measured in the same units as the data, while variance is measured in squared units; the standard deviation is the square root of the variance.
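
A quick illustration of that relationship (example data made up):

x = [2 4 4 4 5 5 7 9];   % example data
v = var(x);              % variance, in squared units
s = std(x);              % standard deviation, in the same units as x
s - sqrt(v)              % essentially zero: std(x) equals sqrt(var(x))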

How can I use standard deviation to analyze my data in Matlab?

Standard deviation can be used to analyze your data in Matlab by giving you an idea of how much your data deviates from the mean. It can also help identify outliers or unusual data points that may need further investigation.
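
For instance, one simple (and admittedly crude) outlier check flags points more than two standard deviations from the mean:

x = [5 4 6 5 5 30 5 6 4 5];                   % made-up data with one unusual value
outliers = x(abs(x - mean(x)) > 2*std(x))     % returns 30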
