#36
askmathquestions
PeroK said: "No additional assumptions are required. The identity element must be unique."

If vectors were invertible matrices, then taking the inverse would suffice for most purposes, and I would be more satisfied with that. I realized this specifically because it would not matter "which" identity you ended up with after obtaining ##AA^{-1}##; it would simplify to ##I_1 = I_2## in either case.
Considering the case where we have only the zero matrix (and hence no identity) unnecessarily muddies the waters.
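For reference, the standard one-line argument for the uniqueness claim: if ##I_1## and ##I_2## are both identities for the same multiplication, then ##I_1 = I_1 I_2 = I_2##, where the first equality uses that ##I_2## is a right identity and the second uses that ##I_1## is a left identity. No assumptions beyond the two identity properties are needed.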
Still, no one has addressed the ##x##, ##y## example I brought up, which seems to suggest that vectors are invertible matrices.
Though in practice, a proof is better if it can be extended to non-vectors and non-square matrices too.
There also might be some confusion, in all the back and forth, about what ##A##, ##I##, and ##x## each refer to.
In my original problem, I took ##A## to be any ##n \times p## matrix that is not identically the zero matrix, so that this covers both vectors and square matrices. I'm mainly interested in left-multiplication by an identity: ##IA = A.##
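As a quick numerical sanity check of the non-square setup (a NumPy sketch; the specific matrix entries are just illustrative), the identity satisfying ##IA = A## for an ##n \times p## matrix must be ##n \times n##, a different identity works on the right, and a nonzero column vector, being non-square, has no two-sided inverse:

```python
import numpy as np

# A is a 3 x 2 matrix (n = 3, p = 2): neither square nor a single column.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# Left identity: I A = A forces I to be n x n (here 3 x 3).
I_left = np.eye(3)
assert np.allclose(I_left @ A, A)

# On the right, a different identity works: A I = A with I being p x p (2 x 2).
I_right = np.eye(2)
assert np.allclose(A @ I_right, A)

# A nonzero column vector x is an n x 1 matrix; it is not square,
# so np.linalg.inv rejects it outright.
x = np.array([[1.0], [2.0], [3.0]])
try:
    np.linalg.inv(x)
    print("inverted (unexpected)")
except np.linalg.LinAlgError:
    print("x is not invertible")
```

This also illustrates why the left/right distinction matters for non-square ##A##: the two identities that act on it are not even the same size.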