Probability Density Function, prove it

In summary, the thread discusses how to prove that the function 1/x^2 for x>0 is a probability density function. The main criteria for a PDF are that the integral of the function over its domain must equal 1 and that the function must be non-negative for all x. The thread also notes that an antiderivative of 1/x^2 is -1/x, but there is difficulty in showing how the first criterion is satisfied.
  • #1
randy27

Homework Statement


This is my 1st post here, so I will do my best. The following question is one of a number of probability density functions that I have to verify. Once I have the hang of this one, I should be good for the rest. Here is the question:

Prove that the following functions are probability density functions:

1/x^2 , x>0

Homework Equations





The Attempt at a Solution



As I understand it, to prove that a function f(x) is a probability density function it must satisfy

1. the integral of f(x) dx over its domain equals 1
and
2. f(x) must be non-negative for all x


I integrated the function 1/x^2, which gives -1/x, but I find it tricky to explain how the integral of f(x) dx comes out to 1.


I would be grateful for pointers on how to prove that the function is a PDF in a clear manner.
 
  • #2
You probably would have gotten a quicker response, but you posted this in the pre-calculus forum...

As I understand it, the integral needs to equal 1 (and f(x) ≥ 0 for all x, which is your point 2). However, I'm having trouble getting 1 when I integrate the function over x>0. Are you sure it's supposed to be x>0, and not x>1? Evaluating the improper integral from 1 to infinity gives 1 (unless I blundered somewhere; I did it quickly).
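
A quick numerical sanity check of that evaluation (a minimal sketch, assuming NumPy and SciPy are available):

    # Numerically integrate 1/x^2 over the two domains discussed above.
    import numpy as np
    from scipy.integrate import quad

    # Improper integral from 1 to infinity: should come out very close to 1.
    value, err = quad(lambda x: 1.0 / x**2, 1.0, np.inf)
    print(value)  # ~1.0

    # Over (0, 1] the integral diverges: shrinking the lower limit toward 0
    # makes the result grow without bound (it equals 1/eps - 1 exactly).
    for eps in (1e-1, 1e-2, 1e-3):
        value, err = quad(lambda x: 1.0 / x**2, eps, 1.0)
        print(eps, value)  # 9.0, 99.0, 999.0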
 
  • #3


For a scientist, it is important to be able to prove that a given function is a probability density function. In this case, we are asked to prove that the function 1/x^2 for x>0 is a probability density function.

To prove this, we must show that the function satisfies two conditions:

1. The integral of the function over its entire domain is equal to 1.
2. The function is non-negative for all values of x.

To begin, let us integrate the function 1/x^2 over its domain of x>0. This can be done using the power rule of integration, which states that the integral of x^n is x^(n+1)/(n+1) for n ≠ -1 (plus a constant). Applying this rule, we get:

∫ 1/x^2 dx = x^(-2+1)/(-2+1) = x^(-1)/(-1) = -1/x

Next, we must evaluate this integral from 0 to infinity, as the function is only defined for x>0. Since both endpoints are improper, this is done using limits: x approaching infinity for the upper bound and x approaching 0 from the right for the lower bound. Therefore, our integral becomes:

∫ 1/x^2 dx = lim as x→∞ (-1/x) - lim as x→0 (-1/x)

As x approaches infinity, the value of -1/x approaches 0. And as x approaches 0 from the right, the value of -1/x approaches negative infinity. Therefore, our integral becomes:

∫ 1/x^2 dx = 0 - (-∞) = ∞

Since our integral is infinite rather than equal to 1, the function does not satisfy the first condition on the domain x>0. Both conditions must hold for a function to be a probability density function, so this already shows that 1/x^2 on x>0 fails to be one. For completeness, we can still check the second condition.

We can see that for x>0, the function 1/x^2 is always positive. This means that the function is non-negative for all values of x, satisfying the second condition of being a probability density function.

In conclusion, the function 1/x^2 is non-negative, but its integral over x>0 diverges, so it is not a probability density function on that domain. As the previous reply points out, the intended domain is almost certainly x>1: the integral of 1/x^2 from 1 to infinity equals 1, and the function is positive there, so 1/x^2 for x>1 satisfies both conditions and is a valid probability density function.
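
Written out, the two improper integrals compared above are (a standard calculus computation, added here for reference):

\int_0^{\infty} \frac{1}{x^2}\,dx
  = \lim_{a \to 0^+} \lim_{b \to \infty} \left[ -\frac{1}{x} \right]_a^b
  = \lim_{a \to 0^+} \left( \frac{1}{a} - 0 \right) = \infty,
\qquad
\int_1^{\infty} \frac{1}{x^2}\,dx
  = \lim_{b \to \infty} \left[ -\frac{1}{x} \right]_1^b
  = \lim_{b \to \infty} \left( 1 - \frac{1}{b} \right) = 1.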
 

FAQ: Probability Density Function, prove it

What is a Probability Density Function (PDF)?

A Probability Density Function (PDF) is a mathematical function that describes the relative likelihood of a random variable taking on a certain value. It shows the distribution of a continuous random variable and is often represented graphically as a curve.

How is a PDF different from a probability distribution?

A PDF is a continuous function that describes the probability density of a random variable at each value; the probability of the variable taking on any single exact value is zero, and probabilities are obtained by integrating the PDF over an interval. A probability distribution is the more general notion: a table, graph, or formula that shows how probability is assigned to outcomes. In other words, a PDF describes the distribution of a continuous random variable, while a probability mass function or table describes the discrete probabilities of individual outcomes.

How is a PDF used in statistics?

In statistics, a PDF is used to calculate the probability that a random variable falls within a certain range of values. It is also used to calculate other important statistical measures, such as mean, median, and standard deviation.
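
Concretely, for a continuous random variable X with PDF f(x), these quantities are all integrals of f (standard definitions, shown here for illustration):

P(a \le X \le b) = \int_a^b f(x)\,dx,
\qquad
\mathbb{E}[X] = \int_{-\infty}^{\infty} x\, f(x)\,dx,
\qquad
\operatorname{Var}(X) = \int_{-\infty}^{\infty} \bigl(x - \mathbb{E}[X]\bigr)^2 f(x)\,dx,

with the standard deviation being the square root of the variance, and the median being the value m for which \int_{-\infty}^{m} f(x)\,dx = 1/2.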

Can you prove the existence of a PDF?

A PDF exists whenever the cumulative distribution function (CDF) is differentiable (more precisely, absolutely continuous), and it is obtained by taking the derivative of the CDF. The CDF gives the probability of the random variable being less than or equal to a specific value, while the PDF gives the probability density at that value.
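
As a standard illustration, take the exponential distribution with rate \lambda > 0 (any textbook example would do):

F(x) = 1 - e^{-\lambda x} \quad (x \ge 0),
\qquad
f(x) = \frac{d}{dx}\,F(x) = \lambda e^{-\lambda x} \quad (x \ge 0),

which is non-negative and integrates to 1 over x \ge 0, so it satisfies both defining conditions of a PDF.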

What are the properties of a PDF?

A PDF has several important properties, including:

  • It is always non-negative: the density f(x) is never less than zero. (This is a statement about the density itself, not about whether the random variable can take negative values.)
  • The total area under the curve is equal to 1, representing the total probability of all possible outcomes.
  • The height of the curve at any point represents the probability density at that point.
