Wednesday, November 15, 2017

definition - Not defining the imaginary number $i$ as the principal square root of $-1$.




Background



I learned early on that it's important that we define the imaginary number $i$ such that $i^2 = -1$, rather than $i = \sqrt{-1}$.



Question



I can't fully remember the reasoning for this important note, so I was wondering if someone could elaborate?



Own efforts




Any explanation I find eventually boils down to the same argument.




If we define $i$ as the principal square root of $-1$, then we get



$$-1 = i^2 = \sqrt{-1}\,\sqrt{-1} \overset{\text{fallacy}}{=} \sqrt{(-1)(-1)} = \sqrt{1} = 1$$




But to me, this seems like a misuse of the $\sqrt{a}\,\sqrt{b} = \sqrt{ab}$ rule, since that rule comes with certain restrictions on $a$ and $b$. So I don't see how this fallacy is a misuse of the definition of $i$ rather than of that rule.
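To make the restriction explicit (this is the standard statement of the rule, added here for reference rather than taken from the original post):

$$\sqrt{a}\,\sqrt{b} = \sqrt{ab} \qquad \text{whenever } a \ge 0 \text{ and } b \ge 0.$$

Taking $a = b = -1$ violates this hypothesis, and that is precisely the step marked "fallacy" above: $\sqrt{-1}\,\sqrt{-1} = i \cdot i = -1$, whereas $\sqrt{(-1)(-1)} = \sqrt{1} = 1$.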




Are there other reasons why we should be careful not to define $i$ as the principal square root of $-1$?


Answer



If you define $i$ as $\sqrt{-1}$, then there is an obvious question: how do you know that $-1$ has a square root at all? Besides, writing $i = \sqrt{-1}$ seems to imply that $i$ is the square root of $-1$. But, in $\mathbb{C}$, $-1$ has two square roots: $\pm i$. Assuming that $i$ is the square root of $-1$ leads to fallacies, such as the one you mentioned.
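To spell out the ambiguity the answer points to (a standard observation, added for clarity): in $\mathbb{C}$ the equation $z^2 = -1$ has exactly two solutions,

$$z^2 = -1 \iff z = i \quad \text{or} \quad z = -i,$$

and complex conjugation $a + bi \mapsto a - bi$ is a field automorphism that swaps them, so nothing intrinsic to $\mathbb{C}$ distinguishes $i$ from $-i$ before a choice has been made. The notation $\sqrt{-1}$ silently presupposes that choice.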

