Random Variables: Definition and 351 Threads

In probability and statistics, a random variable, random quantity, aleatory variable, or stochastic variable is described informally as a variable whose values depend on outcomes of a random phenomenon. The formal mathematical treatment of random variables is a topic in probability theory. In that context, a random variable is understood as a measurable function defined on a probability space that maps from the sample space to the real numbers.

A random variable's possible values might represent the possible outcomes of a yet-to-be-performed experiment, or the possible outcomes of a past experiment whose already-existing value is uncertain (for example, because of imprecise measurements or quantum uncertainty). They may also conceptually represent either the results of an "objectively" random process (such as rolling a die) or the "subjective" randomness that results from incomplete knowledge of a quantity. The meaning of the probabilities assigned to the potential values of a random variable is not part of probability theory itself, but is instead related to philosophical arguments over the interpretation of probability. The mathematics works the same regardless of the particular interpretation in use.
As a function, a random variable is required to be measurable, which allows for probabilities to be assigned to sets of its potential values. It is common that the outcomes depend on some physical variables that are not predictable. For example, when tossing a fair coin, the final outcome of heads or tails depends on the uncertain physical conditions, so the outcome being observed is uncertain. The coin could get caught in a crack in the floor, but such a possibility is excluded from consideration.
The domain of a random variable is called a sample space, defined as the set of possible outcomes of a non-deterministic event. For example, in the event of a coin toss, only two outcomes are possible: heads or tails.
A random variable has a probability distribution, which specifies the probability of Borel subsets of its range. Random variables can be discrete, that is, taking any of a specified finite or countable list of values (having a countable range), endowed with a probability mass function that is characteristic of the random variable's probability distribution; or continuous, taking any numerical value in an interval or collection of intervals (having an uncountable range), via a probability density function that is characteristic of the random variable's probability distribution; or a mixture of both.
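For a concrete illustration of the discrete/continuous split (standard textbook examples, not tied to any thread below): a fair six-sided die defines a discrete random variable with a probability mass function, while a uniform draw from the unit interval defines a continuous random variable with a probability density function:
$$p(k)=\Pr(X=k)=\tfrac{1}{6},\ k=1,\dots,6;\qquad f(x)=1\ \text{on}\ [0,1],\quad \Pr(a\le X\le b)=\int_a^b f(x)\,dx=b-a.$$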
Two random variables with the same probability distribution can still differ in terms of their associations with, or independence from, other random variables. The realizations of a random variable, that is, the results of randomly choosing values according to the variable's probability distribution function, are called random variates.
Although the idea was originally introduced by Christiaan Huygens, the first person to think systematically in terms of random variables was Pafnuty Chebyshev.

View More On Wikipedia.org
  1. rcktbr

    Getting the probability distribution of a random variable

    X and Y are discrete random variables with the following joint distribution: a) Calculate the probability distribution, mean, and variance of Y. My attempt: I have calculated the probability for different values of Y and X using the following equation: ##\Pr(Y = y) = \sum_{i=1}^{l}##...
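    For reference, the marginalization step this attempt is heading toward is the general identity (not specific to the thread's joint table):
    $$\Pr(Y=y)=\sum_{x}\Pr(X=x,\,Y=y),\qquad E[Y]=\sum_y y\,\Pr(Y=y),\qquad \operatorname{Var}(Y)=E[Y^2]-E[Y]^2.$$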
  2. cianfa72

    A Karhunen–Loève theorem expansion random variables

    Hi, in the Karhunen–Loève theorem's statement the random variables in the expansion are given by $$Z_k = \int_a^b X_te_k(t) \: dt$$ ##X_t## is a zero-mean square-integrable stochastic process defined over a probability space ##(\Omega, F, P)## and indexed over a closed and bounded interval ##[a...
  3. D

    B Can I replace ##X_n = i## with ##A## to type less? Rules of math.

    When working with random variables, it is tempting to make substitutions with placeholders, by writing ##A## instead of ##X_n=i##, because it greatly simplifies the look. It seems that if ##A## has all of the attributes of the equation ##X_n=i##, then such substitutions should be allowed...
  4. F

    I Statistical modeling and relationship between random variables

    In statistical modeling, the goal is to come up with a model that describes the relationship between random variables. A function of random variables is also a random variable. We could have three random variables, ##Y##, ##X##, ##\epsilon## with the r.v. ##Y## given by ##Y=b_1 X + b_2 +...
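    Taking the stated model at face value, and assuming ##E[\epsilon]=0## with ##X## and ##\epsilon## uncorrelated (assumptions not spelled out in the snippet), ##Y## inherits its randomness from both ##X## and ##\epsilon##:
    $$E[Y]=b_1E[X]+b_2,\qquad \operatorname{Var}(Y)=b_1^2\operatorname{Var}(X)+\operatorname{Var}(\epsilon).$$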
  5. A

    B Definition of a random variable in quantum mechanics?

    In a line of reasoning that involves measurement outcomes in quantum mechanics, such as spins, photons hitting a detection screen (with discrete positions, like in a CCD), atomic decays (like in a Geiger detector counting at discrete time intervals, etc.), I would like to define rigorously the...
  6. A

    I The covariance of a sum of two random variables X and Y

    Suppose X and Y are random variables. Is it true that Cov (Z,K) = Cov(X,K)+Cov(Y,K) where Z=X+Y?
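    The identity asked about does hold whenever the covariances exist, by bilinearity and linearity of expectation:
    $$\operatorname{Cov}(X+Y,K)=E[(X+Y)K]-E[X+Y]\,E[K]=\big(E[XK]-E[X]E[K]\big)+\big(E[YK]-E[Y]E[K]\big)=\operatorname{Cov}(X,K)+\operatorname{Cov}(Y,K).$$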
  7. WMDhamnekar

    I Expected number of random variables that must be observed

    In my opinion, the answer to (a) is ## \mathbb{E} [N] = p^{-4}q^{-3} + p^{-2}q^{-1} + 2p^{-1} ##. In the answer to (b), ##X_N## is wrong. It should be ##X_N = p^{-4}q^{-3} - p^{-3}q^{-2} - p^{-2}q^{-1} - p^{-1}##. This might be a typographical error. Is my answer to (a) correct?
  8. F

    I Linear regression and random variables

    Hello, I have a question about linear regression models and correlation. My understanding is that our finite set of data ##(x,y)## represents a random sample from a much larger population. Each pair is an observation in the sample. We find, using OLS, the best fit line and its coefficients and...
  9. A

    Probability involving Gaussian random sequences

    How do I approach the following problem while only knowing the PSD of a Gaussian random sequence (i.e. I don't know the exact distribution of $V_k$)? Or am I missing something obvious? Problem statement: Thoughts: I know that with the PSD given, the autocorrelation function is a delta function due...
  10. Euge

    POTW Convergence of Random Variables in L1

    Let ##\{X_n\}## be a sequence of integrable, real random variables on a probability space ##(\Omega, \mathscr{F}, \mathbb{P})## that converges in probability to an integrable random variable ##X## on ##\Omega##. Suppose ##\mathbb{E}(\sqrt{1 + X_n^2}) \to \mathbb{E}(\sqrt{1 + X^2})## as ##n\to...
  11. A

    Help with random variable linear estimation

    Hi all, I have a problem on linear estimation that I would like help on. This is related to Wiener filtering. Problem: I attempted part (a), but I'm not too sure of the answer. As for the unconstrained case in part (b), I don't know how to find the autocorrelation function; I applied the definition...
  12. A

    MSE estimation with random variables

    Hello all, I would appreciate any guidance to the following problem. I have started on parts (a) and (b), but need some help solving for the coefficients. Would I simply take the expressions involving the coefficients, take the derivative and set it equal to 0 and solve? I believe I also need...
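    As general orientation (a sketch, not the thread's specific solution): for linear mean-square estimation, differentiating and setting to zero does work, and is equivalent to the orthogonality principle, i.e. the optimal coefficients ##c_j## satisfy
    $$E\Big[\Big(X-\sum_j c_j Y_j\Big)Y_i\Big]=0\quad\text{for every } i,$$
    which yields a linear system (the normal equations) in the ##c_j##.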
  13. C

    I Randomly Stopped Sums vs the sum of I.I.D. Random Variables

    I've come across the two following theorems in my studies of Probability Generating Functions: Theorem 1: Suppose ##X_1, ... , X_n## are independent random variables, and let ##Y = X_1 + ... + X_n##. Then, ##G_Y(s) = \prod_{i=1}^n G_{X_i}(s)## Theorem 2: Let ##X_1, X_2, ...## be a sequence of...
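    The product form in Theorem 1 is a one-line consequence of independence (since the ##s^{X_i}## are then independent):
    $$G_Y(s)=E\big[s^{X_1+\cdots+X_n}\big]=E\Big[\prod_{i=1}^n s^{X_i}\Big]=\prod_{i=1}^n E\big[s^{X_i}\big]=\prod_{i=1}^n G_{X_i}(s).$$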
  14. A

    MSE estimation with random variables

    Hello all, I am wondering if my approach is correct for the following problem on MSE estimation/linear prediction on a zero-mean random variable. My final answer would be c1 = 1, c2 = 0, and c3 = 1. If my approach is incorrect, I certainly appreciate some guidance on the problem. Thank you...
  15. A

    Determining stationary and mean-ergodicity

    I am having difficulties setting up and characterizing stationarity and ergodicity for a few random processes below. I need to determine whether the random process below is strict-sense stationary (SSS), whether it is wide-sense stationary (WSS), and whether it is ergodic in the mean. All help is...
  16. A

    Sinusoidal sequences with random phases

    Hello all, I have a random sequences question and I am mostly struggling with the last part (e), deriving the marginal pdf. Any help would be greatly appreciated. My attempt for the other parts a - d is also below, and it would be nice if I could get the answers checked to ensure I'm...
  17. A

    Break a Stick Example: Random Variables

    Hello, I would like to confirm my answers to the following random variables question. Would anyone be willing to provide feedback and see if I'm on the right track? Thank you in advance. My attempt:
  18. A

    Probability: pair of random variables

    Hello all, I would like to check my understanding and get some assistance with last part of the following question, please. For part (d), would I use f(x | y) = f(x, y) / f(y) ? Problem statement: My attempt at a solution, not too confident in my set-up for part (d). I drew a sketch of the...
  19. A

    Probability/Random variables question

    Hello all, I am wondering if my approach is correct for the following probability question. I believe the joint PDF would be 1 given that the point is chosen from the unit square. To me, this question can be reduced down to finding the area of 1/4 of a circle with radius 1. Any help is appreciated!
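    Assuming the event is that a uniformly chosen point of the unit square lands within distance 1 of the origin (which is what the quarter-circle comment suggests), the reduction to an area is exactly right:
    $$P(X^2+Y^2\le 1)=\iint_{\{x^2+y^2\le 1\}\cap[0,1]^2}1\,dx\,dy=\frac{\pi}{4}\approx 0.785.$$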
  20. M

    Probability that 𝑌>3𝑋 where 𝑋,𝑌 are 𝑁(0,1) random variables

    After plotting the above (not shown) I believe one way (the hard way) to solve this problem is to compute the following integral where ##f(x) = e^{-x^2/2}/\sqrt{2\pi}##: $$\frac{\int_0^\infty \int_{3X}^\infty f(X)f(Y)\, dydx + \int_{-\infty}^0 \int_0^\infty f(X)f(Y)\...
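    Assuming ##X## and ##Y## are independent (as the ##f(X)f(Y)## factorization suggests), there is a shortcut: ##Y-3X## is a zero-mean normal variable (a linear combination of independent normals), so by symmetry about 0,
    $$P(Y>3X)=P(Y-3X>0)=\tfrac{1}{2}.$$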
  21. Steve Zissou

    I Distribution of Sum of Two Weird Random Variables....

    Hi there. Let's say I have the following relationship: x = a + b*z + c*y, where z is distributed normally and y is distributed according to a different distribution, say exponential. Is there a way to figure out the distribution of x? Thanks!
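    If ##z## and ##y## are independent (an assumption, since the post does not say), the density of ##x## is the convolution of the densities of ##a+bz## and ##cy##:
    $$f_X(x)=\int_{-\infty}^{\infty}f_{a+bZ}(u)\,f_{cY}(x-u)\,du.$$
    For normal plus exponential specifically (with ##c>0##), this convolution is the exponentially modified Gaussian distribution.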
  22. L

    Finding the distribution of random variables

    Hi. I have found the answer to a and c (I don't know whether it is correct) but I do not know what I should find in question b. Is my method correct and how should I solve part b? Thank you for your help!
  23. K

    Using Poisson random variables to calculate this probability

    I calculated the mean, which is 78.4, and the standard deviation is 5.6. I thought the answer would be (90^(-78.4)/78.4!)*e^-90, but looking back, having a decimal factorial doesn't make sense. I have the numerical answers for c) = 0.019226 and d) = 0.022750, but my solution was wrong. Any help on...
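    As a general reminder (not the thread's specific numbers), the Poisson pmf is
    $$P(X=k)=\frac{e^{-\lambda}\lambda^{k}}{k!},\qquad k=0,1,2,\dots,$$
    so the factorial is taken of the integer count ##k##; the mean ##\lambda## need not be an integer.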
  24. D

    Calculate the joint CDF of two random variables

    $$f_{XY}=1$$ $$dz\,dy=2x\,dx\,dy\Rightarrow\frac{1}{2\sqrt{z}}dz\,dy=dx\,dy$$ $$f_{ZY}=\frac{1}{2\sqrt{z}}\quad \text{on some region S}$$ $$F_{ZY}=\int^y_{g}\int^x_{h}\frac{1}{2\sqrt{z}}dz\,dy\quad\text{for some}\quad g(x,y),h(x,y)$$ I'm learning how to find the region S using a change-of-variables technique.
  25. D

    Are X and Y dependent random variables?

    (a) The area of the triangles is 1, so γ is one. (b) I'm not sure how to prove it. I feel like ##X## and ##Y## are dependent because ##E(Y|X=0)=\frac{1}{2}## and ##E(Y|X=1)=0##, so ##Y## seems dependent on ##X##. ##f_X=1-x## for ##x>0## and ##f_X=1+x## for ##x<0##, so X seems independent of Y.
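    One general fact that settles part (b): if ##X## and ##Y## were independent, the conditional expectation would be constant,
    $$E[Y\mid X=x]=E[Y]\quad\text{for every admissible }x,$$
    so exhibiting two values of ##x## with different conditional expectations is already enough to conclude dependence.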
  26. M

    Bound correlation coefficient for three random variables

    Hi, I just found this problem and was wondering how I might go about approaching the solution. Question: Given three random variables ## X##, ##Y##, and ## Z ## such that ##\text{corr}(X, Y) = \text{corr}(Y, Z) = \text{corr}(Z, X) = r ##, provide an upper and lower bound on ##r## Attempt: I...
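    One standard route (a sketch): the correlation matrix with ones on the diagonal and ##r## off-diagonal must be positive semidefinite; its eigenvalues are ##1+2r## and ##1-r## (the latter with multiplicity two), so
    $$1+2r\ge 0\quad\text{and}\quad 1-r\ge 0\quad\Longrightarrow\quad -\tfrac{1}{2}\le r\le 1.$$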
  27. B

    Mixed random variables problem

    I got (a) and (b) but I'm still working on (c). The solutions can be found here for your reference: https://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-041sc-probabilistic-systems-analysis-and-applied-probability-fall-2013/unit-ii/lecture-9/MIT6_041SCF13_assn05_sol.pdf. But...
  28. P

    I Find P(X+Y>1/2) for given joint density function

    Hey everybody, :smile: I have a joint density of the random variables ##X## and ##Y## given and want to find out ##P(X+Y>1/2)##. The joint density is as follows: $$f_{XY}(x,y) = \begin{cases}\frac{1}{y}, &0<x<y,0<y<1 \\ 0, &else \end{cases}$$ To get a view of this I created a plot: As...
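    Taking the stated density at face value, one way to organize the complement: for ##0<y<\tfrac12## the constraint is ##x<\min(y,\tfrac12-y)##, which splits at ##y=\tfrac14##, giving
    $$P(X+Y\le\tfrac12)=\int_0^{1/4}\frac{1}{y}\,y\,dy+\int_{1/4}^{1/2}\frac{1}{y}\Big(\tfrac12-y\Big)dy=\frac{\ln 2}{2},$$
    so ##P(X+Y>\tfrac12)=1-\tfrac{\ln 2}{2}\approx 0.653## (worth double-checking against the thread's own solution).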
  29. S

    Prob/Stats Material on complex random variables and exotic probabilities

    I am looking for books that have sections or even chapters devoted to complex random variables, or random variables that can take on complex values (NOT probabilities that take complex values, to be clear). On the other hand, if someone does know any books that contain material on...
  30. Armine

    Proof of a formula with two geometric random variables

    The image above is the problem, and the image below is my attempted solution, which did not work out.
  31. U

    MHB Verifying Solution for Exponentially Distributed Random Vars.

    Given two i.i.d. random variables $X,Y$ such that $X\sim \exp(1), Y \sim \exp(1)$, I am looking for the probability $\Phi$. However, the analytical solution that I have got does not match my simulation. I am presenting it here in the hope that someone can rectify my mistake. ...
  32. archaic

    Linear combination of random variables

    a) Total weight ##W=W_1+W_2+...+W_{25}##.$$E[W]=E[W_1]+E[W_2]+...+E[W_{25}]=25\times76=1900\,kg$$$$\sigma_W=\sqrt{V(W_1)+V(W_2)+...+V(W_{25})}=\sqrt{25\times(16)^2}=80\,kg$$ b) Since ##W## is a linear combination of normal distributions, the reproductive property tells us that ##W## is also...
  33. TheBigDig

    Sum of the Expected Values of Two Discrete Random Variables

    Apologies if this isn't the right forum for this. In my stats homework we have to prove that the expected value of aX + bY is aE[X]+bE[Y], where X and Y are random variables and a and b are constants. I have come across this proof but I'm a little rusty with summations. How is the jump from the...
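    For the discrete case the jump is just a rearrangement of the double sum (a sketch, assuming a joint pmf ##p(x,y)##):
    $$E[aX+bY]=\sum_x\sum_y(ax+by)\,p(x,y)=a\sum_x x\sum_y p(x,y)+b\sum_y y\sum_x p(x,y)=aE[X]+bE[Y].$$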
  34. D

    Three independent random variables having Normal distribution

    Let ##X_1, X_2, X_3## be three independent random variables having a Normal (Gaussian) distribution, all with mean ##\mu=20## and variance ##\sigma^2=9##. Also let ##S=X_1+ X_2 +X_3## and let ##N## be the number of the ##X_i## assuming values greater than 25. ##E\left[N\right]=?## I did not...
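    A sketch of the standard route for ##E[N]##: write ##N## as a sum of indicator variables ##\mathbf{1}\{X_i>25\}##, so that, with ##Z\sim N(0,1)## and ##\sigma=3##,
    $$E[N]=\sum_{i=1}^{3}P(X_i>25)=3\,P\!\left(Z>\frac{25-20}{3}\right)=3\big(1-\Phi(5/3)\big).$$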
  35. D

    Expected value of two uniformly distributed random variables

    ##X_1## and ##X_2## are uniformly distributed random variables with parameters ##(0,1)##; then ##E\left[\min\left\{ X_1 , X_2 \right\}\right] =\,?## What should I do with that min?
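    One standard way to handle the min (a sketch, assuming the two variables are independent): work with the survival function and use ##E[M]=\int_0^\infty P(M>t)\,dt## for a nonnegative variable. For i.i.d. Uniform(0,1),
    $$P(\min\{X_1,X_2\}>t)=(1-t)^2,\qquad E[\min\{X_1,X_2\}]=\int_0^1(1-t)^2\,dt=\tfrac13.$$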
  36. D

    Uniform distribution of two random variables

    I did not get how the professor came to such a result. In particular: in order to evaluate ##P[X+Y\le z]## he solved a double integral of the joint density. What I am not getting is how to choose the extremes of integration in order to get ##\frac {z^2} {2}## as the result.
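    Geometrically, assuming ##X## and ##Y## are independent Uniform(0,1) with joint density 1 on the unit square (the usual setup behind this result), for ##0\le z\le 1## the region ##\{x+y\le z\}## inside the square is a right triangle with legs of length ##z##:
    $$P(X+Y\le z)=\int_0^z\!\!\int_0^{z-y}1\,dx\,dy=\int_0^z(z-y)\,dy=\frac{z^2}{2}.$$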
  37. C

    I Expected Value of 2^X and 2^-X for Geometric and Poisson Distributions?

    For the following distributions find $$E[2^X]$$ and $$E[2^{-X}]$$ if finite. In each case, clearly state for what values of the parameter the expectation is finite. (a) $$X\sim Geom(p)$$ (b) $$X\sim Pois(\lambda)$$ My attempt: Using LOTUS and $$E[X]=\sum_{k=0}^{\infty}kP(X=k)=\frac{1-p}{p}$$...
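    For orientation, using the convention ##P(X=k)=p(1-p)^k,\ k\ge 0## (which matches the quoted mean ##\tfrac{1-p}{p}##), these are a geometric series and a Poisson exponential sum:
    $$E[2^X]=\sum_{k\ge 0}p\,[2(1-p)]^k=\frac{p}{2p-1}\ \ (\text{finite iff } p>\tfrac12),\qquad E[2^X]=e^{-\lambda}\sum_{k\ge 0}\frac{(2\lambda)^k}{k!}=e^{\lambda}\ \ \text{for } X\sim\text{Pois}(\lambda).$$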
  38. C

    MATLAB Problem with random variables in Matlab's PCA

    Hello. I have designed a Gaussian kernel as: [X,Y] = meshgrid(0:0.002:1,0:0.002:1); Z=exp((-1)*abs(X-Y)); Now, I calculate PCA: [coeffG, scoreG, latentG, tsquaredG, explainedG, muG]=pca(Z, 'Centered',false); I can rebuild the original data properly as defined in the documentation...
  39. WMDhamnekar

    MHB Distribution and Density functions of maximum of random variables

    1] Let X,Y,Z be independent, identically distributed random variables, each with density $f(x)=6x^5$ for $0\leq x\leq 1,$ and 0 elsewhere. How to find the distribution and density functions of the maximum of X,Y,Z. 2] Let X and Y be independent random variables, each with density $e^{-x},x\geq...
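    For part 1], the standard route: with ##F(x)=\int_0^x 6t^5\,dt=x^6## on ##[0,1]##, independence gives
    $$F_{\max}(x)=P(\max\{X,Y,Z\}\le x)=F(x)^3=x^{18},\qquad f_{\max}(x)=18x^{17},\quad 0\le x\le 1.$$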
  40. Boltzman Oscillation

    How can I determine the random variables for this problem?

    So I first need to come up with the sample space, X, and Y. Well, I would guess that the random variables here are N1 and N2 and thus X = N1 and Y = N2. Now I need to make these random variables a function of L. I don't know what L should be but I would guess it is the outcome of a 1ms interval...
  41. WMDhamnekar

    MHB Check Martingale Sequences from i.i.d. Variables | Stats SE

    How to answer this question $\rightarrow$https://stats.stackexchange.com/q/398321/72126
  42. WMDhamnekar

    MHB Joint probability distribution of functions of random variables

    If X and Y are independent gamma random variables with parameters $(\alpha,\lambda)$ and $(\beta,\lambda)$, respectively, compute the joint density of U=X+Y and $V=\frac{X}{X+Y}$ without using Jacobian transformation. Hint:The joint density function can be obtained by differentiating the...
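    For checking purposes, the classical answer this computation should land on (the known result, not the book's derivation): ##U\sim\text{Gamma}(\alpha+\beta,\lambda)## and ##V\sim\text{Beta}(\alpha,\beta)##, and the two are independent, so
    $$f_{U,V}(u,v)=\frac{\lambda^{\alpha+\beta}u^{\alpha+\beta-1}e^{-\lambda u}}{\Gamma(\alpha+\beta)}\cdot\frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)}\,v^{\alpha-1}(1-v)^{\beta-1},\qquad u>0,\ 0<v<1.$$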
  43. WMDhamnekar

    MHB Two normal independent random variables

    Let X and Y be independent normal random variables, each having parameters $\mu$ and $\sigma^2$. I want to show that X+Y is independent of X-Y without using a Jacobian transformation. Hint given by the author: Find their joint moment generating functions. Answer: Now the joint MGF of...
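    The MGF route works cleanly here (a sketch, assuming independence as stated): with ##M(t)=e^{\mu t+\sigma^2t^2/2}##,
    $$E\big[e^{t(X+Y)+s(X-Y)}\big]=E\big[e^{(t+s)X}\big]\,E\big[e^{(t-s)Y}\big]=e^{2\mu t+\sigma^2 t^2}\,e^{\sigma^2 s^2},$$
    which factors into a function of ##t## times a function of ##s##, giving independence of ##X+Y## and ##X-Y##.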
  44. P

    MHB Calculation of probability with arithmetic mean of the sum of random variables

    Calculation of probability with the arithmetic mean of random variables: there are 4 people, each of whom has one deck of 500 cards numbered from 1 to 500 with no duplicates. Each person draws a card from his deck, and I would like to calculate the probability of the event that...
  45. S

    MHB How Do You Calculate the Expected Value and Variance of Yi in a Noisy Image?

    Hello, I have the following question and I am wondering if I am on the right path. Here is the question: a picture in which each pixel takes the value 1 with probability q and 0 with probability 1-q, where q is the realized value of an r.v. Q which is uniformly distributed on the interval [0,1]. Let Xi be the...
  46. M

    I Pdf of Difference of Random Variables

    I want to find the probability density function (pdf) of the difference of two RVs, ##p_{\Delta Y} = p_{Y_1 - Y_2}##, where ##y = \sin \theta##, and where ##\theta_1## and ##\theta_2## are random variables with the same uniform distribution ##p_{\theta}=\mathrm{rect}\left(\frac{\theta}{\pi}\right)##. This has...
  47. F

    Calculating the covariance of two discrete random variables

    Homework Statement: If the random variables T and U have the same joint probability function at the following five pairs of outcomes: (0, 0), (0, 2), (-1, 0), (1, 1), and (-1, 2). What is the covariance of T and U? Homework Equations: ##\sigma_{XY} = E(XY) - \mu_X\mu_Y## The Attempt at a Solution: My issue with...
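    Assuming each of the five pairs carries probability ##\tfrac15## (the usual reading of "the same joint probability function"), the arithmetic comes out to zero:
    $$E[T]=\tfrac{0+0-1+1-1}{5}=-\tfrac15,\quad E[U]=\tfrac{0+2+0+1+2}{5}=1,\quad E[TU]=\tfrac{0+0+0+1-2}{5}=-\tfrac15,\quad \sigma_{TU}=E[TU]-E[T]E[U]=0.$$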
  48. binbagsss

    Characteristic function of the sum of random variables

    Homework Statement: I am trying to understand the very last equality for (let me replace the tilde with a hat) ##\hat{P}_{X}(k)=\hat{P}(k_1=k_2=\dots=k_{N}=k)## (1) Homework Equations: I also thought that the following imaginary exponential delta identity may be useful, due to the equality of...
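    The underlying fact being used, stated generally (independent of the thread's notation): for independent, identically distributed summands ##X=\sum_{i=1}^N x_i##, independence turns the characteristic function of the sum into a power of the single-variable one,
    $$\hat{P}_X(k)=E\big[e^{ik\sum_i x_i}\big]=\prod_{i=1}^N E\big[e^{ikx_i}\big]=\big[\hat{P}(k)\big]^N,$$
    which is the same as evaluating the joint characteristic function at ##k_1=\cdots=k_N=k##.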
  49. Simonel

    Show Standard Deviation is Zero When X=k

    Show that the standard deviation is zero if and only if X is a constant function, that is, X(s) = k for every s belonging to S, or simply X = k. When they say constant function, it means every element in S is mapped to a single element in the range; that single element is k. Which means...
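    A sketch of the forward direction on a countable sample space with all ##P(s)>0##: if ##\sigma=0##, then
    $$0=\operatorname{Var}(X)=\sum_{s\in S}P(s)\,(X(s)-\mu)^2,$$
    and since every term is nonnegative, each must vanish, forcing ##X(s)=\mu## for all ##s##. The converse is immediate, because a constant ##X=k## has ##E[X]=k## and ##E[(X-k)^2]=0##.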