
So I have a problem understanding conflicting results of a derivative.

Consider the derivative of x^2, which is 2x.

However, if x^2 is expressed as a sum of x's such that f(x) = x + x + x + x + ... (x times), then the derivative of f(x) becomes 1 + 1 + 1 + 1 + ... (x times) = x. Hence this derivation shows the derivative of x^2 to be x.

Clearly this can't be correct. Where is the fallacy in this?
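To double-check which result is the real derivative, here is a quick numerical sanity check using a central difference quotient (the function name `derivative_estimate` is just my own label for this sketch):

```python
def derivative_estimate(f, x, h=1e-6):
    """Approximate f'(x) with the central difference quotient (f(x+h) - f(x-h)) / (2h)."""
    return (f(x + h) - f(x - h)) / (2 * h)

# Numerically differentiate x^2 at a few points.
for x in [1.0, 2.0, 3.0]:
    est = derivative_estimate(lambda t: t * t, x)
    print(x, est)  # estimate is close to 2x, not x
```

The estimates come out near 2x at every point, so it is the term-by-term argument that must be flawed.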

My idea is that the summation is linear in x whilst x^2 is non-linear, hence the summation won't converge to x^2. However, this is only an idea.