Andres kubliki
Does anybody know why the output changes when the sensitivity setting is changed?
When I'm measuring noise, let's say I get a value of 5 nV/sqrt(Hz) at 10 nV sensitivity. When I switch to 20 nV sensitivity, the output goes to 10 nV/sqrt(Hz). Why?
Which one of them is correct?