- #1
Aston08
I'm currently using very simple logic to determine the directional trend of a signal line, and I was hoping someone might be able to suggest a more effective method of filtering out false signals.
As it stands, the logic for determining the direction of a trend is: if the value of a prior data point (generally 1-2 bars back) is less than the current bar, it signals an up trend, and vice versa. The issue I'm having is that this logic is so sensitive that tiny spikes aren't being completely smoothed out by the moving average.
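For reference, the naive comparison described above might look something like this sketch (the function name and the assumption of a plain list of smoothed values are mine, not from the original post):

```python
def naive_direction(values, lookback=2):
    """Up trend (+1) if the bar `lookback` periods ago is below the
    current bar, down trend (-1) if above, flat (0) otherwise.
    Reacts to every tiny spike, which is the problem described."""
    current = values[-1]
    prior = values[-1 - lookback]
    if prior < current:
        return 1
    if prior > current:
        return -1
    return 0
```

Even a 0.01% uptick flips this to +1, which is why small spikes that survive the moving average still generate signals.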
Any suggestions on a more effective way to qualify whether a variation is a likely divergence? Possibly a minimum threshold for slope or percentage change?
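One way the minimum-threshold idea could be sketched (the function name, the `min_change` parameter, and the choice of percentage change over raw slope are my own assumptions, not a stated solution):

```python
def trend_direction(values, lookback=2, min_change=0.001):
    """Like the naive comparison, but only signals a trend when the
    percentage change over `lookback` bars exceeds `min_change`
    (e.g. 0.001 = 0.1%); smaller moves are treated as flat/noise."""
    current = values[-1]
    prior = values[-1 - lookback]
    change = (current - prior) / abs(prior) if prior else 0.0
    if change > min_change:
        return 1
    if change < -min_change:
        return -1
    return 0
```

A move that doesn't clear the threshold returns 0, so tiny spikes no longer flip the direction; the trade-off is a small lag before a genuine trend is recognized.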
The areas of issue are those circled in red; sharp transitions like the one highlighted by the blue arrow are more valid.
I would greatly appreciate any suggestions