In calculus, the sum rule in differentiation is a method of finding the derivative of a function that is the sum of two other functions for which derivatives exist. This is a part of the linearity of differentiation. The sum rule in integration follows from it. The rule itself is a direct consequence of differentiation from first principles.
The sum rule tells us that for two functions u and v:
\frac{d}{dx}(u+v) = \frac{du}{dx} + \frac{dv}{dx}
This rule also applies to subtraction and to additions and subtractions of more than two functions:
\frac{d}{dx}(u+v+w+\dots) = \frac{du}{dx} + \frac{dv}{dx} + \frac{dw}{dx} + \cdots
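As a quick check (not part of the article's argument), the identity can be verified symbolically in Python with the sympy library; the particular functions sin(x) and x² below are arbitrary choices:

    import sympy as sp

    x = sp.symbols('x')
    u = sp.sin(x)   # arbitrary example function
    v = x**2        # arbitrary example function

    lhs = sp.diff(u + v, x)              # d/dx (u + v)
    rhs = sp.diff(u, x) + sp.diff(v, x)  # du/dx + dv/dx
    assert sp.simplify(lhs - rhs) == 0
    print(lhs)  # cos(x) + 2*x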
Simple Proof
Let h(x) = f(x) + g(x), and suppose that f and g are each differentiable at x. We want to prove that h is differentiable at x and that its derivative h′(x) is given by f′(x) + g′(x).
h'(x) = \lim_{\Delta x \to 0} \frac{h(x+\Delta x) - h(x)}{\Delta x}
= \lim_{\Delta x \to 0} \frac{[f(x+\Delta x) + g(x+\Delta x)] - [f(x) + g(x)]}{\Delta x}
= \lim_{\Delta x \to 0} \frac{f(x+\Delta x) - f(x) + g(x+\Delta x) - g(x)}{\Delta x}
= \lim_{\Delta x \to 0} \frac{f(x+\Delta x) - f(x)}{\Delta x} + \lim_{\Delta x \to 0} \frac{g(x+\Delta x) - g(x)}{\Delta x}
= f'(x) + g'(x).
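The step that splits the limit is exactly the observation that the difference quotient of h is the sum of the difference quotients of f and g, which holds before any limit is taken. A minimal numerical sketch, assuming the arbitrary example functions exp and cos:

    import math

    # arbitrary example functions, not from the article
    f, g = math.exp, math.cos
    h = lambda t: f(t) + g(t)

    x, dx = 1.0, 1e-6   # arbitrary point and small increment

    def dq(fn):
        # difference quotient (fn(x + dx) - fn(x)) / dx
        return (fn(x + dx) - fn(x)) / dx

    print(dq(h))          # difference quotient of the sum
    print(dq(f) + dq(g))  # sum of the difference quotients: agrees to rounding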
More Complicated Proof
Let y be a function given by the sum of two functions u and v, such that:
y = u + v
Now let y, u and v be increased by small increments Δy, Δu and Δv respectively. Hence:
y + \Delta y = (u+\Delta u) + (v+\Delta v) = u + v + \Delta u + \Delta v = y + \Delta u + \Delta v.
So:
\Delta y = \Delta u + \Delta v.
Now divide throughout by Δx:
\frac{\Delta y}{\Delta x} = \frac{\Delta u}{\Delta x} + \frac{\Delta v}{\Delta x}.
Let Δx tend to 0:
\frac{dy}{dx} = \frac{du}{dx} + \frac{dv}{dx}.
Now recall that y = u + v, giving the sum rule in differentiation:
\frac{d}{dx}(u+v) = \frac{du}{dx} + \frac{dv}{dx}.
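The increment bookkeeping above can be mirrored numerically: the relation Δy = Δu + Δv holds for any finite increment, before Δx tends to 0. A small sketch, assuming the arbitrary example functions sin and exp:

    import math

    u, v = math.sin, math.exp   # arbitrary example functions
    y = lambda t: u(t) + v(t)

    x, dx = 0.5, 1e-3           # arbitrary point and increment
    du = u(x + dx) - u(x)       # Δu
    dv = v(x + dx) - v(x)       # Δv
    dy = y(x + dx) - y(x)       # Δy

    print(dy, du + dv)          # equal up to rounding: Δy = Δu + Δv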
The rule can be extended to subtraction, as follows:
\frac{d}{dx}(u-v) = \frac{d}{dx}\left(u+(-v)\right) = \frac{du}{dx} + \frac{d}{dx}(-v).
Now use the special case of the constant factor rule in differentiation with k = −1 to obtain:
\frac{d}{dx}(u-v) = \frac{du}{dx} + \left(-\frac{dv}{dx}\right) = \frac{du}{dx} - \frac{dv}{dx}.
Therefore, the sum rule can be extended to cover both addition and subtraction, as follows:
\frac{d}{dx}(u \pm v) = \frac{du}{dx} \pm \frac{dv}{dx}.
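Both signs can be checked together symbolically; log(x) and √x below are arbitrary example functions:

    import sympy as sp

    x = sp.symbols('x')
    u, v = sp.log(x), sp.sqrt(x)   # arbitrary example functions

    for sign in (+1, -1):          # check both u + v and u - v
        lhs = sp.diff(u + sign * v, x)
        rhs = sp.diff(u, x) + sign * sp.diff(v, x)
        assert sp.simplify(lhs - rhs) == 0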
The sum rule in differentiation can be used as part of the derivation for both the sum rule in integration and the linearity of differentiation.
Generalization to finite sums
Consider a set of functions f_1, f_2, ..., f_n. Then
\frac{d}{dx}\left(\sum_{i=1}^{n} f_i(x)\right) = \frac{d}{dx}\left(f_1(x) + f_2(x) + \cdots + f_n(x)\right) = \frac{d}{dx}f_1(x) + \frac{d}{dx}f_2(x) + \cdots + \frac{d}{dx}f_n(x)
so
\frac{d}{dx}\left(\sum_{i=1}^{n} f_i(x)\right) = \sum_{i=1}^{n}\left(\frac{d}{dx} f_i(x)\right).
In other words, the derivative of any finite sum of functions is the sum of the derivatives of those functions.
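A short symbolic sketch of the finite-sum identity, using the arbitrary family f_i(x) = x^i with n = 5 (any finite family of differentiable functions would do):

    import sympy as sp

    x = sp.symbols('x')
    n = 5                                   # arbitrary finite number of terms
    fs = [x**i for i in range(1, n + 1)]    # arbitrary family f_i(x) = x^i

    lhs = sp.diff(sum(fs), x)               # derivative of the sum
    rhs = sum(sp.diff(f, x) for f in fs)    # sum of the derivatives
    assert sp.simplify(lhs - rhs) == 0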
This follows easily by induction; we have just proven this to be true for n = 2. Assume it is true for all n < k; then define
g(x) = \sum_{i=1}^{k-1} f_i(x).
Then
\frac{d}{dx}\left(\sum_{i=1}^{k} f_i(x)\right) = \frac{d}{dx}g(x) + \frac{d}{dx}f_k(x).
By the inductive hypothesis,
\frac{d}{dx}g(x) = \frac{d}{dx}\left(\sum_{i=1}^{k-1} f_i(x)\right) = \sum_{i=1}^{k-1} \frac{d}{dx} f_i(x)
so
\frac{d}{dx}\left(\sum_{i=1}^{k} f_i(x)\right) = \sum_{i=1}^{k-1} \frac{d}{dx} f_i(x) + \frac{d}{dx} f_k(x) = \sum_{i=1}^{k} \frac{d}{dx} f_i(x)
which completes the proof of the sum rule of differentiation.
Note that this does not automatically extend to infinite sums. An intuitive reason why things can go wrong is that more than one limit is involved (specifically, one for the sum and one in the definition of the derivative), and these limits cannot always be interchanged. Uniform convergence deals with these sorts of issues.
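The classical counterexample \sum_{n=1}^{\infty} \sin(n^2 x)/n^2 (not mentioned in the original text) makes the failure concrete: the series converges uniformly by the Weierstrass M-test, yet its term-by-term derivative \sum_{n=1}^{\infty} \cos(n^2 x) diverges at x = 0, since every term there equals 1. A quick computation of the differentiated partial sums shows the divergence:

    import math

    def termwise_derivative_at_zero(N):
        # term-by-term derivative of sum_{n=1}^{N} sin(n^2 x) / n^2,
        # evaluated at x = 0: each term contributes cos(0) = 1
        return sum(math.cos(n**2 * 0.0) for n in range(1, N + 1))

    for N in (10, 100, 1000):
        print(N, termwise_derivative_at_zero(N))  # equals N: grows without bound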