andrewcheong
Hello, all. I know what I want, but I just don't know what it's called.
This has to do with regression (polynomial fits). Given a set of N (x,y) points, we can compute a regression of degree K. For example, we could have a hundred (x,y) points and compute a linear regression (degree 1). Of course, there would be residual error, because the line of best fit won't go through every point perfectly. We could also compute quadratic (degree 2) or higher-degree regressions. These should reduce the residual error, or at least be no worse than the lower-degree regressions.
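To make this concrete, here's a rough sketch in Python of what I mean (the data and degrees are just made up for illustration):

```python
import numpy as np

# Made-up noisy data, just to illustrate the idea.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 100)
y = 0.5 * x**2 - 3.0 * x + rng.normal(scale=2.0, size=x.size)

for degree in (1, 2, 3, 5):
    coeffs = np.polyfit(x, y, deg=degree)      # least-squares fit of this degree
    residuals = y - np.polyval(coeffs, x)      # pointwise errors of the fit
    print(degree, np.sum(residuals**2))        # total squared error shrinks (or holds) as degree grows
```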
Now, what I want is a regression that determines the "best degree". I mean, if I have N points, I can always get a perfect fit by computing a regression of degree N-1. For example, if I only have 2 points, a degree-1 regression (linear) can fit both points perfectly. If I only have 3 points, a degree-2 regression (quadratic) can fit all three points perfectly, and so on. So if I have 100 points, one might say that a degree-99 regression is the "best degree". However, I look at higher degrees as a cost.
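Here's a tiny illustration of that perfect-fit case (again, the numbers are arbitrary):

```python
import numpy as np

# Any 3 points with distinct x-values can be matched exactly by a degree-2 polynomial.
x = np.array([1.0, 2.0, 4.0])
y = np.array([3.0, -1.0, 5.0])

coeffs = np.polyfit(x, y, deg=len(x) - 1)  # degree N-1 for N points
print(np.polyval(coeffs, x) - y)           # essentially zero: the curve passes through every point
```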
I want a method of determining a regression that balances low residual error against low degree. I imagine there must be some sort of "cost" parameter that I have to set, because the computer alone cannot say what the "right" balance between residual error and degree is.
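In rough code terms, I picture something like the following, where `lam` is the hand-set cost parameter I'm describing (this is just my naive guess at how such a method might look, not a technique I actually know the name of):

```python
import numpy as np

def best_degree(x, y, max_degree, lam):
    """Pick the degree that minimizes (sum of squared residuals) + lam * degree."""
    best, best_score = None, np.inf
    for degree in range(1, max_degree + 1):
        coeffs = np.polyfit(x, y, deg=degree)
        rss = np.sum((y - np.polyval(coeffs, x)) ** 2)
        score = rss + lam * degree             # larger lam -> stronger preference for low degree
        if score < best_score:
            best, best_score = degree, score
    return best

# Made-up example: a larger lam pushes the answer toward lower degrees.
rng = np.random.default_rng(1)
x = np.linspace(-1.0, 1.0, 100)
y = np.sin(3.0 * x) + rng.normal(scale=0.1, size=x.size)
print(best_degree(x, y, max_degree=10, lam=1.0))
```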
Can anyone point me to the name of such a technique? Perhaps the most commonly used form of it?
I want to apply this to stock market prices. As human beings, we can look at a plot of stock prices and mentally "fit" a smooth curve across the points that makes sense. But how does a computer do this? We can't just tell it to do a perfect fit, because then it'll do an N-1 degree fit (e.g. cubic B-splines).
Thanks in advance!