In probability and statistics, a random variable, random quantity, aleatory variable, or stochastic variable is described informally as a variable whose values depend on outcomes of a random phenomenon. The formal mathematical treatment of random variables is a topic in probability theory. In that context, a random variable is understood as a measurable function defined on a probability space that maps from the sample space to the real numbers.
A random variable's possible values might represent the possible outcomes of a yet-to-be-performed experiment, or the possible outcomes of a past experiment whose already-existing value is uncertain (for example, because of imprecise measurements or quantum uncertainty). They may also conceptually represent either the results of an "objectively" random process (such as rolling a die) or the "subjective" randomness that results from incomplete knowledge of a quantity. The meaning of the probabilities assigned to the potential values of a random variable is not part of probability theory itself, but is instead related to philosophical arguments over the interpretation of probability. The mathematics works the same regardless of the particular interpretation in use.
As a function, a random variable is required to be measurable, which allows for probabilities to be assigned to sets of its potential values. It is common that the outcomes depend on some physical variables that are not predictable. For example, when tossing a fair coin, the final outcome of heads or tails depends on the uncertain physical conditions, so the outcome being observed is uncertain. The coin could get caught in a crack in the floor, but such a possibility is excluded from consideration.
The domain of a random variable is called a sample space, defined as the set of possible outcomes of a non-deterministic event. For example, in the event of a coin toss, only two possible outcomes are possible: heads or tails.
A random variable has a probability distribution, which specifies the probability of Borel subsets of its range. Random variables can be discrete, that is, taking any of a specified finite or countable list of values (having a countable range), endowed with a probability mass function that is characteristic of the random variable's probability distribution; or continuous, taking any numerical value in an interval or collection of intervals (having an uncountable range), via a probability density function that is characteristic of the random variable's probability distribution; or a mixture of both.
Two random variables with the same probability distribution can still differ in terms of their associations with, or independence from, other random variables. The realizations of a random variable, that is, the results of randomly choosing values according to the variable's probability distribution function, are called random variates.
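As a brief illustration of the discrete/continuous distinction and of random variates, here is a minimal sketch (assuming Python with NumPy) that draws realizations of a discrete random variable (a fair die) and of a continuous one (a standard normal):

```python
import numpy as np

rng = np.random.default_rng(0)

# Discrete random variable: a fair six-sided die, with a probability mass
# function putting weight 1/6 on each of the values 1..6.
die_variates = rng.choice([1, 2, 3, 4, 5, 6], size=5)

# Continuous random variable: a standard normal, with a probability density
# function; each draw below is a realization, i.e. a random variate.
normal_variates = rng.standard_normal(5)

print(die_variates)
print(normal_variates)
```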
Although the idea was originally introduced by Christiaan Huygens, the first person to think systematically in terms of random variables was Pafnuty Chebyshev.
Homework Statement
Let \psi(x) = 2\Phi(x) - 1, where \Phi is the standard normal distribution function. The function \psi is called the positive normal distribution. Prove that if Z is standard normal, then |Z| is positive normal.
Homework Equations
The Attempt at a Solution
I am not really sure where to begin with this. Can anyone...
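The key observation is that P(|Z| <= x) = P(-x <= Z <= x) = \Phi(x) - \Phi(-x) = 2\Phi(x) - 1 = \psi(x) for x >= 0. A quick numerical sanity check of that identity (a sketch assuming Python with NumPy and SciPy, not a proof):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
z = rng.standard_normal(1_000_000)

for x in [0.5, 1.0, 2.0]:
    empirical = np.mean(np.abs(z) <= x)   # P(|Z| <= x) estimated by simulation
    psi = 2 * norm.cdf(x) - 1             # the claimed positive normal CDF
    print(x, empirical, psi)              # the two values should agree closely
```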
Hi. The question is:
Given X, Y and Z are all continuous, independent random variables uniformly distributed on (0,1), prove that (XY)^Z is also uniformly distributed on (0,1).
I worked out the pdf of XY=W. I think it's -ln(w). I have no idea at all how to show that W^Z is U(0,1).
What...
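A Monte Carlo sanity check (a sketch assuming Python with NumPy, not a proof) of the claim that (XY)^Z is U(0,1); the density -ln(w) for W = XY can be checked the same way:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
x, y, z = rng.random(n), rng.random(n), rng.random(n)   # three independent U(0,1) samples

w = (x * y) ** z
# If (XY)^Z is U(0,1), then P((XY)^Z <= t) should be close to t for every t in (0,1).
for t in [0.1, 0.25, 0.5, 0.9]:
    print(t, np.mean(w <= t))
```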
Homework Statement
A communications channel transmits the digits 0 and 1. However, due to static, the digit transmitted is incorrectly received with probability 0.2. Suppose that we want to transmit an important message consisting of one binary digit. To reduce the chance of error, we...
Homework Statement
\Omega is a set of points \omega; C_{i}, i = 1, 2, \ldots, 7, are subsets of \Omega;
and (\Omega, F, P) = (B_{i}, i/10, i = 1, 2, 3, 4) is a probability model
with B_{1} = C_{1} \cup C_{7}, B_{2} = C_{2} \cup C_{6}, B_{3} = C_{3} \cup C_{5} and B_{4} = C_{4}.
State...
"Factorizing" random variables
Suppose we have a (discrete) random variable X. Call a random variable Y "equivalent" to X if there are functions f, g such that X = f(Y) and Y = g(X). Among other things, this implies H(Y) = H(X).
Y = Y1 x Y2 x ... x Yn, where x is the Cartesian product...
Hello,
Can somebody please explain to me the difference between generating random numbers and generating random variables? The confusion is mainly because most of the time texts write that the average of n (iid) random variables, in the limiting sense, reaches the expectation of the first random variable.
I...
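A sketch of the distinction (assuming Python with NumPy): the random variable is the mathematical object (here, "exponential with mean 2"); generating random numbers produces realizations of it, and the running average of iid realizations approaches the common expectation, which is the limiting statement the texts refer to:

```python
import numpy as np

rng = np.random.default_rng(3)

# The random variable is "Exponential with mean 2" (a distribution, not a number).
# The array below holds realizations (random variates) of that variable.
samples = rng.exponential(scale=2.0, size=100_000)

running_mean = np.cumsum(samples) / np.arange(1, samples.size + 1)
print(running_mean[[9, 99, 9_999, 99_999]])   # drifts toward the expectation, 2.0
```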
Homework Statement
Suppose X is a discrete random variable with probability mass function
pX(x)=1/5, if x=-2,-1,0,1,2
pX(x)=0, otherwise
Let Y = X^2. Are X and Y independent? Prove using definitions and theorems.
Homework Equations
The Attempt at a Solution
The random variables X and Y...
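A small enumeration (a sketch assuming Python; it does not replace the definition-based proof the problem asks for) exhibiting events that violate P(X in A, Y in B) = P(X in A) P(Y in B):

```python
from fractions import Fraction

# X is uniform on {-2, -1, 0, 1, 2}; Y = X^2 is determined by X.
support = [-2, -1, 0, 1, 2]
p = Fraction(1, 5)

p_x_eq_2 = p                                                 # P(X = 2)
p_y_eq_4 = sum(p for x in support if x * x == 4)             # P(Y = 4) = P(X = -2) + P(X = 2)
p_joint = sum(p for x in support if x == 2 and x * x == 4)   # P(X = 2 and Y = 4)

print(p_joint, p_x_eq_2 * p_y_eq_4)   # 1/5 vs 2/25: not equal, so X and Y are not independent
```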
I am studying for the FRM and there is a question concerning the captioned. I try to start off by following the standard expectation calculation and break down the pdf into a Bayesian conditional probability function. Then I got stuck there. Can anyone help me to find a proof of it? Many thanks.
Let X and Y be two random variables.
Say, for example, they have the following joint probability mass function
          x = -1   x = 0   x = 1
y = -1       0       1/4      0
y =  0      1/4       0      1/4
y =  1       0       1/4      0
What is the proper way of computing E(XY)?
Can I let Z=XY and find E(Z)=∑...
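For reference, E(XY) = \sum_x \sum_y x y \, p(x,y), which agrees with setting Z = XY and computing E(Z) = \sum_z z P(Z = z). A sketch (assuming Python) with the table above:

```python
from fractions import Fraction

# Joint pmf from the table: pmf[(x, y)] = P(X = x, Y = y); omitted cells are 0.
q = Fraction(1, 4)
pmf = {(0, -1): q, (-1, 0): q, (1, 0): q, (0, 1): q}

# E(XY) = sum over the support of x * y * p(x, y)
e_xy = sum(x * y * p for (x, y), p in pmf.items())
print(e_xy)   # 0 here, since every cell with positive mass has x = 0 or y = 0
```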
If Z1, Z2, ..., Zn are standard normal random variables that are independently and identically distributed, then how can one prove that squaring and summing them will produce a chi-squared random variable with n degrees of freedom?
Any help on this will be greatly appreciated. I am new to this...
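The usual proof works through the moment generating function (or a change of variables for n = 1, then convolution/induction). A simulation sketch (assuming Python with NumPy and SciPy) comparing the empirical distribution of \sum Z_i^2 with the chi-squared(n) CDF:

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(4)
n, trials = 5, 200_000

z = rng.standard_normal((trials, n))
s = (z ** 2).sum(axis=1)      # sum of n squared standard normals, one value per trial

for t in [2.0, 5.0, 10.0]:
    print(t, np.mean(s <= t), chi2.cdf(t, df=n))   # empirical CDF vs chi-squared(n) CDF
```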
Homework Statement
An ambulance travels back and forth, at a constant speed, along a road of length
L. At a certain moment of time an accident occurs at a point uniformly distributed on the
road. (That is, its distance from one of the fixed ends of the road is uniformly distributed
over...
Suppose I have a sample X_1, ..., X_n of independently, identically distributed exponential random variables.
One result I deduced was that the ratio of any two of them (e.g. X_1 / X_2) is independent of the sample average 1/n * \sum_{i=1}^{n} X_i.
(Aside: that ratio, as a random variable...
Man, I hate probability... anyhow, could someone help me with this Q, as I am not understanding how to set it up...
Suppose that the force acting on a column which helps to support a building is normally distributed with mean 15.0 kips and standard deviation 1.25 kips:
What is the probability...
Q: If X_1 and X_2 are independent exponential random variables with respective parameters \lambda_1 and \lambda_2, find the distribution of Z = X_1 / X_2.
Discussion
The method of attack most apparent to me is coming up with a cumulative distribution function for Z and then...
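The CDF approach gives, for z > 0, P(Z \leq z) = P(X_1 \leq z X_2) = \int_0^\infty \lambda_2 e^{-\lambda_2 y}(1 - e^{-\lambda_1 z y})\,dy = \frac{\lambda_1 z}{\lambda_1 z + \lambda_2}. A simulation sketch (assuming Python with NumPy) that checks this closed form:

```python
import numpy as np

rng = np.random.default_rng(5)
lam1, lam2, n = 2.0, 0.5, 1_000_000

x1 = rng.exponential(scale=1 / lam1, size=n)   # exponential with rate lam1
x2 = rng.exponential(scale=1 / lam2, size=n)   # exponential with rate lam2
z = x1 / x2

for t in [0.1, 1.0, 5.0]:
    analytic = lam1 * t / (lam1 * t + lam2)    # CDF derived above
    print(t, np.mean(z <= t), analytic)        # the two columns should agree closely
```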
Hi, Guys,
I'm new to this forum, and don't have a strong background in probability theory, so please bear with me if the question is too naive.
Here's the question,
In a problem I'm trying to model, I have a random variable (say, R), which is a sum of random number (say, N) of random variables...
PDF (or MGF) of the maximum of dependent exponential random variables?
max of Z1, Z2, Z3, Z4
where
Z1 = |X1+X2+X3|^2 + |Y1+Y2+Y3|^2
Z2 = |X1-X2+X3|^2 + |Y1-Y2+Y3|^2
Z3 = |X1+X2-X3|^2 + |Y1+Y2-Y3|^2
Z4 = |X1-X2-X3|^2 + |Y1-Y2-Y3|^2
Xi, Yi are independent zero mean normal with...
Hi all,
I want to find the maximum of two random variables which are correlated and are non-Gaussian too. Basically I need an analytical or approximate solution to their bivariate distribution, with the mean and variance of the resulting distribution. There is some work by Clark 'The greatest of finite...
Homework Statement
Let X_1, \ldots, X_6 be a sequence of independent and identically distributed continuous random variables. Find
(a) P\{X_6 > X_1 \, | \, X_1= \max(X_1, \ldots, X_5)\}
(b) P\{X_6 > X_2 \, | \, X_1 = \max(X_1, \ldots, X_5)\}
The attempt at a solution
(a) is the...
A discrete random variable has range space {1, 2, ..., n} and satisfies P(X=j) = j/c for some number c. Find c, and then find E(X), E(X^2), E(1/X) and Var(X).
thanks
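A sketch of the computation, using the power-sum formulas \sum_{j=1}^{n} j = \frac{n(n+1)}{2}, \sum_{j=1}^{n} j^2 = \frac{n(n+1)(2n+1)}{6} and \sum_{j=1}^{n} j^3 = \frac{n^2(n+1)^2}{4}: normalization forces c = \frac{n(n+1)}{2}; then E(X) = \frac{1}{c}\sum j^2 = \frac{2n+1}{3}, E(X^2) = \frac{1}{c}\sum j^3 = \frac{n(n+1)}{2}, E(1/X) = \frac{1}{c}\sum 1 = \frac{2}{n+1}, and Var(X) = E(X^2) - E(X)^2 = \frac{(n-1)(n+2)}{18}.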
For two independent and identically distributed random variables having the exponential distribution, do they have the same lambda value, or are the lambda values different?
Hi
I have a question regarding i.i.d. random variables. Suppose X_1, X_2, \ldots is a sequence of independent and identically distributed random variables with probability density function f_{X}(x), mean \mu and variance \sigma^2 < \infty.
Define
Y_{n} = \frac{1}{n}\sum_{i=1}^{n}X_{i}...
Homework Statement
A discrete random variable X has the following PMF
x    |  1  |   2  |  3  | 4 |  5
p(x) | 1-a | 1-2a | 0.2 | a | 0.5a
What are the values of "a" that are allowed in this PMF?
For the allowed values, compute the expected value and the standard deviation of the variable...
Hi all,
assume we have a sample space with 2^n points (its size is 2^n for some natural n).
I need to prove that the maximal number of independent binary (indicator) random variables (which are not trivial, i.e. not constant) is n...
Thanks,
Pitter
Hi,
Another question...
I know that the minimum of n i.i.d. \lambda-exponentially distributed random variables is again exponentially distributed (with parameter n\lambda). Is something similar true for \Gamma(k,\theta)...? That is, is the minimum of n i.i.d. Gamma distributed random variables...
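For the exponential case the result follows from P(\min_i X_i > t) = \prod_i P(X_i > t) = e^{-n\lambda t}; the same product argument suggests that no equally clean form should be expected for general \Gamma(k,\theta), since the n-th power of its survival function is not of Gamma type. A quick check of the exponential case (a sketch assuming Python with NumPy):

```python
import numpy as np

rng = np.random.default_rng(6)
lam, n, trials = 1.5, 4, 500_000

x = rng.exponential(scale=1 / lam, size=(trials, n))
m = x.min(axis=1)                                         # minimum of n iid Exp(lam)

for t in [0.1, 0.5, 1.0]:
    print(t, np.mean(m <= t), 1 - np.exp(-n * lam * t))   # empirical vs Exp(n*lam) CDF
```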
I have two independent standard normal random variables X1, X2. Now I want to construct two new normal random variables Y1, Y2 with means \mu_1, \mu_2, variances \sigma_1^2, \sigma_2^2, and correlation \rho.
How do I approach this problem?
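One standard construction (a sketch assuming Python with NumPy): set Y_1 = \mu_1 + \sigma_1 X_1 and Y_2 = \mu_2 + \sigma_2(\rho X_1 + \sqrt{1-\rho^2}\, X_2). This is the two-dimensional case of multiplying independent standard normals by a Cholesky factor of the target covariance matrix.

```python
import numpy as np

rng = np.random.default_rng(7)
mu1, mu2, sig1, sig2, rho = 1.0, -2.0, 2.0, 0.5, 0.7   # example parameters
n = 1_000_000

x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)

y1 = mu1 + sig1 * x1
y2 = mu2 + sig2 * (rho * x1 + np.sqrt(1 - rho**2) * x2)

print(y1.mean(), y1.std(), y2.mean(), y2.std())   # close to mu1, sig1, mu2, sig2
print(np.corrcoef(y1, y2)[0, 1])                  # close to rho
```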
Can anyone tell me how to find the joint PDF of two random variables? I can't seem to find an explanation anywhere. I'm trying to solve a problem but I'm not sure where to go with it:
Y is an exponential random variable with parameter \lambda=4. X is also an exponential random variable and...
Homework Statement
A random variable has distribution function F(z) = P(Y <= z) given by (this is a piecewise function)
F(z) =
0 if z < -1
1/2 if -1 <= z < 1
1/2 + (1/4)(z - 1) if 1 <= z < 2
1 if 2 <= z
What is P(Y = 2)?
Find all the numbers t with the property that both P(Y <= t) >=...
Homework Statement
A random variable has a distribution function F(z) given by
F(z) = 0 if z< -1
F(z) = 1/2 if -1 <= z < 2
F(z) = 1 - z^{-3} if 2 <= z
Find the associated mean and variance.
The Attempt at a Solution
I drew the distribution function. I started with the associated...
Homework Statement
If X is represented by the Gaussian distribution, that is,
f_{X}(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp{(-\frac{x^2}{2\sigma^2})}
find an expression for the pdf f_Z(z) of Z = arctan(X).
The Attempt at a Solution
If Z =g(X), then g(X) is multivalued unless the range of...
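One way to proceed: on its principal branch, arctan is strictly increasing from the whole real line onto (-\pi/2, \pi/2), so the single-valued change-of-variables formula applies directly: f_Z(z) = f_X(\tan z)\,\frac{d}{dz}\tan z = \frac{\sec^2 z}{\sigma\sqrt{2\pi}}\exp\!\left(-\frac{\tan^2 z}{2\sigma^2}\right) for |z| < \pi/2, and f_Z(z) = 0 otherwise.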
Dear all,
I wonder whether there exists a probability inequality for the sum of independent normal random variables (X_i are i.i.d. normal random variables with mean \mu and variance \sigma^2):
P\left(\frac{1}{n}\sum_{i=1}^n X_i - \mu > \epsilon\right) \leq f(\epsilon, \sigma^2, n).
We...
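One such inequality follows from the fact that \frac{1}{n}\sum_{i=1}^n X_i - \mu is exactly N(0, \sigma^2/n), together with the standard Gaussian tail (Chernoff) bound P(N(0,1) > t) \leq e^{-t^2/2} for t \geq 0: for \epsilon \geq 0, P\left(\frac{1}{n}\sum_{i=1}^n X_i - \mu > \epsilon\right) = 1 - \Phi\!\left(\frac{\sqrt{n}\,\epsilon}{\sigma}\right) \leq \exp\!\left(-\frac{n\epsilon^2}{2\sigma^2}\right).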
Hi fellow members, I would appreciate it if you could help with the following problem; it has had me stumped!
Prove the statistical distance between random variables X & Y
Thank You, and have a great day!
I have two random variables X and Y. Now the distribution of X and Y is a bit complicated. Basically they follow Gamma distributions, X=\Gamma(k1,\theta) and Y=\Gamma(k2,\theta), but k1 and k2 are Poisson distributed. But I do have a closed form expression for the distribution of X and Y, and...
Hi,
What is meant by "convergence of random variables"? Specifically, this statement confuses me:
The sequence of random variables X_1, X_2, ... , X_n is said to converge in probability to the constant c if for any \epsilon > 0:
\lim_{n \rightarrow \infty} P(\vert X_n - c \vert <...
Suppose I had n random variables, all of which have the same type of distribution but different means and variances. How can I formally describe the distribution of these n random variables?
Well I'm getting pretty frustrated by this problem which arose in my research, so I'm hoping someone here might set me on the right track.
I start with n random variables x_i, i=1..n each independently normally distributed with mean of 0 and variance 1.
I now have two different functions...
A multiple choice test contains 12 questions, 8 of which have 4 answers each to choose from and 4 of which have 5 answers to choose from. If a student randomly guesses all of his answers, what is the probability that he will get exactly 2 of the 4-answer questions correct and at least 3 of the 5...
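A sketch of the computation (assuming Python, and assuming the two groups of questions are answered independently and that the second condition refers to the five-answer questions): the number correct in each group is binomial, so the answer is P(exactly 2 of 8 with p = 1/4) times P(at least 3 of 4 with p = 1/5).

```python
from math import comb

def binom_pmf(k, n, p):
    """P(exactly k successes in n independent trials with success probability p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# 8 questions with 4 choices (p = 1/4), 4 questions with 5 choices (p = 1/5)
p_exactly_2 = binom_pmf(2, 8, 1/4)
p_at_least_3 = sum(binom_pmf(k, 4, 1/5) for k in (3, 4))

print(p_exactly_2 * p_at_least_3)   # product, by independence of the two groups
```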
I have been working on this problem and can't seem to get the answer.
Problem:
X is a continuous random variable with a probability density function:
f(x) = 1/4 if -2 <= x <= 2
0 otherwise
Let Y=1/X. Then P(Y<=1/2) = ?
This is how I approached the problem...
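One way to see the answer: {1/X <= 1/2} splits by the sign of X. For X < 0, 1/X is negative and hence always <= 1/2; for X > 0, 1/X <= 1/2 requires X >= 2, which has probability 0 under this density. So P(Y <= 1/2) = P(X < 0) = 1/2. A quick simulation check (a sketch assuming Python with NumPy):

```python
import numpy as np

rng = np.random.default_rng(8)
x = rng.uniform(-2.0, 2.0, size=1_000_000)
y = 1.0 / x                   # X = 0 has probability zero, so this is safe in practice

print(np.mean(y <= 0.5))      # should be close to 1/2
```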
Let S={1,2,3,4,5,6}, F=σ(A1,A2), i.e., the σ-algebra generated by A1 and A2 (the smallest σ-algebra containing A1 and A2), with A1={1,2,3,4} and A2={3,4,5,6}. Please complete the following:
a. List all sets in F
b. Is the random variable X(w)= 2, w=1,2,3,4; X(w)=7, w=5,6 measurable w.r.t. F...
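A brute-force sketch (assuming Python) that builds σ(A1, A2) by closing {A1, A2} under complements, unions and intersections. The atoms come out as {1,2}, {3,4}, {5,6}, so F contains 2^3 = 8 sets, and measurability of X reduces to checking that the preimages {X = 2} = {1,2,3,4} and {X = 7} = {5,6} belong to F.

```python
S = frozenset({1, 2, 3, 4, 5, 6})
A1, A2 = frozenset({1, 2, 3, 4}), frozenset({3, 4, 5, 6})

# Close {A1, A2} under complement, union and intersection until nothing new appears.
F = {frozenset(), S, A1, A2}
changed = True
while changed:
    changed = False
    current = list(F)
    for B in current:
        for C in current:
            for D in (S - B, B | C, B & C):
                if D not in F:
                    F.add(D)
                    changed = True

print(len(F))                                                  # 8 sets
print(sorted(sorted(s) for s in F))
print(frozenset({1, 2, 3, 4}) in F, frozenset({5, 6}) in F)    # preimages of X: both True
```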
I've been working on a problem and was wondering if someone could check and see if I am on the right track.
A company produces gas from two plants, A and B. (Both are considered to be continuous random variables, X and Y respectively.)
For Plant A, its probability density function is...
anyone's help would be really appreciated. I can't figure out that one.
If X and Y are jointly distributed random variables, what is the joint distribution function of U=min(X,Y) and V=max(X,Y)?
I got something like 2[u(v-u) + ½u^2]
then how do I work towards an expression for the marginal...
Hi
I'm wondering if someone can help me out on this question as to how to go about doing it:
X_1, X_2, ..., X_7 are independent random variables representing a random sample of size 7 from the normal N(10, 7) distribution. Find, to 3 dp, the probability that the sample total exceeds 88.
I tried to...
I have a question about independent random variables:
Let say we flip a fair coin, the set of outcome is S={H,T}, P(H)=1/2, P(T)=1/2. Define random variable X:S->R by X(H)=1, X(T)=-1.
From what I read in books, I can define X1 and X2 as independent identically distributed (iid) random variables...
Hi, I really need help with joint PDF, if anyone can help, that would be super! :smile:
Random Variables X and Y have joint PDF
fx,y (x, y) = 1/2 if -1 <= x <= y <= 1, and it is 0 otherwise
a) what is fy (y)?
b) what is fx|y (x|y)?
c) what is E[X|Y = y]?
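A sketch of how (a)-(c) fall out of the definitions (worth double-checking against the normalization \int f_Y(y)\,dy = 1): f_Y(y) = \int_{-1}^{y} \tfrac{1}{2}\,dx = \frac{y+1}{2} for -1 \le y \le 1; f_{X|Y}(x|y) = \frac{f_{X,Y}(x,y)}{f_Y(y)} = \frac{1}{y+1} for -1 \le x \le y, i.e. X given Y = y is uniform on [-1, y]; hence E[X|Y=y] = \frac{y-1}{2}.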
Hi I need some help. I don't think I did any of this right.
A small business just leased a new computer and color laser printer for three years. The service contract for the computer offers unlimited repairs for a fee of $100 a year plus a $25 service charge for each repair needed. The...
Hello (first time poster),
I am having quite a bit of trouble with a particular problem on stats (which I despise!) - in particular, discrete random variables. OK, here is the question:
"Find the probability distribution of X in each of the following questions ...
Two fair dice...
I hope someone can help me understand functions of random variables:
If X~Uniform(A,B), A < X < B
Y~Normal(0,1), -inf < Y < inf
and Z = X + Y
- what is the pdf of Z?
- how can I calculate a probability like P(Z < 3)?
- what is the conditional probability P(Z<z | X = x)?
- what is the...
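A sketch (assuming Python with NumPy/SciPy, using example numeric values for A and B) based on the convolution f_Z(z) = \int f_X(x) f_Y(z - x)\,dx = \frac{\Phi(z-A) - \Phi(z-B)}{B-A} for X ~ Uniform(A,B) independent of Y ~ N(0,1). Probabilities such as P(Z < 3) then follow by numerical integration, and conditionally on X = x one simply has Z | X = x ~ N(x, 1), so P(Z < z | X = x) = \Phi(z - x).

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

A, B = 0.0, 2.0    # example endpoints for X ~ Uniform(A, B)

def pdf_z(z):
    # Convolution of the Uniform(A, B) density with the standard normal density
    return (norm.cdf(z - A) - norm.cdf(z - B)) / (B - A)

p_z_less_3, _ = quad(pdf_z, -np.inf, 3.0)    # P(Z < 3) by numerical integration
print(p_z_less_3)

# Monte Carlo cross-check
rng = np.random.default_rng(9)
z = rng.uniform(A, B, 1_000_000) + rng.standard_normal(1_000_000)
print(np.mean(z < 3.0))
```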
Hello, I need some help on the independence of random variables...
"How do I prove that if X and Y are two independent random variables, then U=g(X) and V = h(Y) are also independent?"
- Isn't it that if random variables X and Y are independent, it implies
that f(x,y) = g(x)h(y) and vice...
Consider the experiment of tossing a die thrice. X is defined as the number of different faces that appear (i.e., X = 1, 2, or 3). What is meant by the "number of different faces that appear"? Could you help me with how I could get P(X = 1), P(X = 2) and P(X = 3)?
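One way to pin down the definition and the probabilities is to enumerate all 6^3 equally likely ordered outcomes and count the distinct faces in each (a sketch assuming Python); the exact values come out to P(X=1) = 6/216 = 1/36, P(X=2) = 90/216 = 5/12 and P(X=3) = 120/216 = 5/9.

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# Enumerate all ordered outcomes of three die tosses; X = number of distinct faces shown.
counts = Counter(len(set(outcome)) for outcome in product(range(1, 7), repeat=3))
total = 6 ** 3

for k in (1, 2, 3):
    print(k, Fraction(counts[k], total))   # 1/36, 5/12, 5/9
```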