I've been having some trouble with outliers messing up the best-fit line on my scatter plot in Python. I'm using numpy's polyfit function to calculate the slope and y-intercept of the best-fit line, but I always seem to get one or two points that throw off the slope enough to make quite a noticeable difference. I've already checked a few Python references and done a lengthy Google search, but haven't found a solution. Does anyone know of a good way to fix this problem without having to limit the interval or manually remove the bad points from my data?
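
For reference, here's roughly what I'm doing (a minimal sketch; the arrays here are made-up placeholders just to show the shape of the code, my real x and y come from measurements):

import numpy as np
import matplotlib.pyplot as plt

# Placeholder data standing in for my real measurements;
# the 30.0 plays the role of one of the outliers I'm talking about.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([0.1, 1.9, 4.1, 6.0, 8.2, 30.0, 12.1])

# Degree-1 fit: polyfit returns the coefficients highest degree first,
# so this gives [slope, intercept].
slope, intercept = np.polyfit(x, y, 1)

plt.scatter(x, y)
plt.plot(x, slope * x + intercept)
plt.show()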
Edit: Also, knowing a way to take errors into account would be very helpful as well.
Thanks!