torquerotates
This is a really basic question that I don't know why I'm not getting. From my understanding, a <=> b is defined as (a => b) and (b => a).
So say we have a^2 = c and a = c^(1/2). Which implies which?
Say I start from a = c^(1/2). If I square both sides, I get a^2 = c. So a = c^(1/2) => a^2 = c.
But if I start from a^2 = c and take the square root of both sides, I get a = c^(1/2) or a = -c^(1/2).
Since a = c^(1/2) is one of the possibilities, I have shown that a = c^(1/2) is implied by a^2 = c.
So a^2 = c <=> a = c^(1/2).
But this is clearly not true: the two are not equivalent, because taking the square root of both sides gives a = +c^(1/2) or a = -c^(1/2). Is there something wrong with my reasoning?
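For a concrete numeric check of the asymmetry being asked about, here is a minimal Python sketch; the particular values a = -2 and c = 4 are just an illustrative assumption, not part of the original question.

```python
# Check both directions with a = -2, c = 4 (so a^2 = c holds but a is the negative root).
a = -2.0
c = a ** 2  # c = 4.0, so a^2 = c is true by construction

# Direction 1: does a = c^(1/2) hold? (It shouldn't here, since a is negative.)
print("a == c^(1/2):", a == c ** 0.5)        # False: -2 != 2

# Direction 2: a^2 = c holds, but which root does a equal?
print("a^2 == c:", a ** 2 == c)              # True
print("a == -c^(1/2):", a == -(c ** 0.5))    # True: a is the negative root
```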