kyphysics
I often get confused by the conventions for how we interpret decimals in relation to percentages.
First, is 100% = 1, conventionally (or logically)?
I ask because I think I've seen people use .001 to mean 1% and .1 to mean 10%. But that would mean 1 is 100%.
Likewise, I've seen people just use 100%, 10%, and 1%.
Does it really matter what we use, as long as one is consistent in the scale of things?
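For concreteness, the standard convention is that "percent" literally means "per hundred," so n% = n/100. A few lines of Python (just an illustration of that rule, not part of the original question) make the scale explicit:

```python
def percent_to_decimal(p):
    """Percent means "per hundred", so n% = n / 100."""
    return p / 100

print(percent_to_decimal(100))  # 1.0  -> 100% = 1
print(percent_to_decimal(10))   # 0.1  -> 10%  = 0.1
print(percent_to_decimal(1))    # 0.01 -> 1%   = 0.01, not 0.001
```

Under this convention .001 would be 0.1%, not 1% — so yes, the scale matters, and the usual one fixes 100% = 1.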