Statistics: Proofs and Problems for Random Variables and their Distributions

In summary: the first question is about algebra. Since (1-p)^2 = 1 - 2p + p^2 is an identity, it makes no difference whether you expand before or after substituting a value for p. The second question uses the second derivative of the geometric series \sum_{y=1}^{\infty} q^y to compute E[Y(Y-1)] = 2q/p^2 for a geometric random variable, which together with E[Y] = 1/p gives Var[Y] = q/p^2.
  • #1
StopWatch

Homework Statement

Before I get started here I have one really quick basic question:

Let's say I want the probability that an engine survives two hours, given that the probability an engine fails in any given hour is .02. Then I get 1 - .02 - .98(.02) = .9604. This comes from the geometric distribution: 1 minus the sum q^{1-1}p + q^{2-1}p.

This got me thinking, though: if I have q^2 = (1-p)^2, this gives me (.98)^2. Is this not equal to 1 - 2p + p^2? I feel like I'm forgetting something basic about order of operations, but if I haven't plugged in p yet I don't see why it applies. Why can't I expand and then solve, rather than solve and then expand (though strictly in this case I have nothing more to expand)?
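A quick numerical sketch of the two routes to .9604 (p = 0.02 is the per-hour failure probability assumed from the problem statement):

```python
# Sketch: compare the geometric-sum route with the direct q^2 route
# (assumes independent hours with failure probability p = 0.02 each).
p = 0.02
q = 1 - p  # probability the engine survives any given hour

# Route 1: 1 minus the geometric probabilities of failing in hour 1 or 2,
# i.e. 1 - (q^0 * p + q^1 * p).
survive_geometric = 1 - (q**0 * p + q**1 * p)

# Route 2: survive two independent hours directly, q^2.
survive_squared = q**2

print(round(survive_geometric, 4))  # 0.9604
print(round(survive_squared, 4))    # 0.9604
```

Both routes agree because failing in hour 1 or hour 2 is the complement of surviving both hours.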


P(Y = y) = q^{y-1} p, where q = 1 - p


My other question is more interesting, I suppose: find E[Y(Y-1)] for a geometric random variable Y by finding the second derivative of the infinite sum \sum_{y=1}^{\infty} q^y, and use this result to find the variance. To be honest, I have no idea where to begin on this one. Any guidance would be greatly appreciated. Thanks.
 
  • #2
For the first question: yes, you can expand first and substitute later. (1-p)^2 = 1 - 2p + p^2 is an identity, so it holds for every value of p; expanding and then plugging in p = .02 gives 1 - .04 + .0004 = .9604, exactly the same as (.98)^2. No order-of-operations rule is being violated. Note also that "survives two hours" means no failure in either hour, which is q^2 directly, so the two routes to .9604 are the same computation.

For the second question, start from the geometric series with ratio q (where 0 < q < 1):

\sum_{y=1}^{\infty} q^y = \frac{q}{1-q}

Differentiate both sides twice with respect to q. Term by term on the left (the y = 1 term vanishes):

\frac{d^2}{dq^2} \sum_{y=1}^{\infty} q^y = \sum_{y=2}^{\infty} y(y-1)q^{y-2}

and on the right:

\frac{d^2}{dq^2}\, \frac{q}{1-q} = \frac{2}{(1-q)^3} = \frac{2}{p^3}

Now use this to evaluate E[Y(Y-1)]:

E[Y(Y-1)] = \sum_{y=1}^{\infty} y(y-1)q^{y-1}p = pq \sum_{y=2}^{\infty} y(y-1)q^{y-2} = pq \cdot \frac{2}{p^3} = \frac{2q}{p^2}

Since E[Y(Y-1)] = E[Y^2] - E[Y] and E[Y] = \frac{1}{p} for a geometric random variable, we get:

E[Y^2] = \frac{2q}{p^2} + \frac{1}{p}

and therefore:

Var[Y] = E[Y^2] - (E[Y])^2 = \frac{2q}{p^2} + \frac{1}{p} - \frac{1}{p^2} = \frac{2q + p - 1}{p^2} = \frac{q}{p^2}
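As a sanity check, the closed forms above can be verified numerically by truncating the infinite sums (a sketch; p = 0.02 is taken from the first post, and N = 5000 is an assumed truncation point where the tail is negligible):

```python
# Numerical check of the geometric-distribution results:
# E[Y] = 1/p, E[Y(Y-1)] = 2q/p^2, Var[Y] = q/p^2,
# assuming Y is geometric on {1, 2, ...} with P(Y = y) = q^(y-1) * p.
p = 0.02
q = 1 - p
N = 5000  # truncation point; q^N is astronomically small here

e_y = sum(y * q**(y - 1) * p for y in range(1, N + 1))
e_yy1 = sum(y * (y - 1) * q**(y - 1) * p for y in range(1, N + 1))
variance = e_yy1 + e_y - e_y**2  # Var[Y] = E[Y(Y-1)] + E[Y] - (E[Y])^2

print(abs(e_y - 1 / p))           # ~0: matches E[Y] = 1/p = 50
print(abs(e_yy1 - 2 * q / p**2))  # ~0: matches E[Y(Y-1)] = 2q/p^2 = 4900
print(abs(variance - q / p**2))   # ~0: matches Var[Y] = q/p^2 = 2450
```

The truncated sums agree with the closed forms to floating-point precision, since the omitted tail is on the order of q^N.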
 

FAQ: Statistics: Proofs and Problems for Random Variables and their Distributions

1. What is the purpose of studying random variables and their distributions in statistics?

The purpose of studying random variables and their distributions in statistics is to understand and analyze the patterns and behaviors of data that are subject to randomness. This allows us to make informed decisions and predictions based on the data.

2. What are the main types of random variables?

The main types of random variables are discrete and continuous. Discrete random variables take on a finite or countably infinite number of values, while continuous random variables can take on any value within a specific range.

3. How are random variables and their distributions related?

Random variables and their distributions are closely related because the distribution of a random variable describes the probability of each possible outcome occurring. In other words, the distribution helps us understand how likely it is for a specific value to be observed for a given random variable.

4. What is the difference between a probability density function (PDF) and a cumulative distribution function (CDF)?

A probability density function (PDF) describes the relative likelihood (density) of a continuous random variable near a given value; the probability of any single exact value is zero, and probabilities are obtained by integrating the PDF over an interval. A cumulative distribution function (CDF) gives the probability that a random variable is less than or equal to a given value.
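As a small sketch of the distinction (the exponential distribution is an illustrative choice here, not part of the FAQ): integrating the PDF up to x recovers the CDF.

```python
import math

rate = 1.0  # assumed rate parameter for the exponential example

def pdf(x):
    return rate * math.exp(-rate * x)  # density f(x) = rate * e^(-rate*x)

def cdf(x):
    return 1 - math.exp(-rate * x)     # F(x) = P(X <= x)

# A crude Riemann sum of the PDF from 0 to x approximates the CDF at x.
x = 2.0
n = 100_000
riemann = sum(pdf(i * x / n) * (x / n) for i in range(n))
print(abs(riemann - cdf(x)) < 1e-4)  # True: the CDF is the integral of the PDF
```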

5. How are probabilities calculated for random variables and their distributions?

Probabilities for random variables and their distributions are calculated using mathematical formulas and techniques, such as the binomial distribution, normal distribution, and Poisson distribution. These formulas take into account the characteristics of the random variable, such as its type and parameters, to determine the probability of a specific outcome or range of outcomes occurring.
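As a sketch of this point (the specific distributions and parameter values are illustrative assumptions, not from the FAQ), probabilities can be computed directly from the pmf formulas:

```python
import math

def binomial_pmf(k, n, p):
    # P(X = k) = C(n, k) * p^k * (1-p)^(n-k)
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    # P(X = k) = e^(-lam) * lam^k / k!
    return math.exp(-lam) * lam**k / math.factorial(k)

# Probability of exactly 2 successes in 10 trials with success chance 0.1:
print(binomial_pmf(2, 10, 0.1))
# Probability of exactly 3 events when the mean number of events is 2:
print(poisson_pmf(3, 2.0))
```

In each case the formula uses the variable's type (discrete counts here) and its parameters (n and p, or the mean lam) to produce the probability of a specific outcome.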
