Partial differentiation
Differentials

Given a single-variable function $f(x)$, we call $df$ and $dx$ the differentials. Using a Taylor series expansion, consider a small variation, $\Delta x$, of the argument of $f$, such that

$$f(x+\Delta x)=f(x)+f^{\prime}(x)\,\Delta x+\mathcal{O}\left((\Delta x)^{2}\right).$$

Using

$$\Delta f(x)=f(x+\Delta x)-f(x),$$

this gives

$$\Delta f(x)=f^{\prime}(x)\,\Delta x+\mathcal{O}\left((\Delta x)^{2}\right).$$

In infinitesimal form, as $\Delta x \rightarrow 0$, we have the differential

$$df=f^{\prime}(x)\,dx.$$
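As a quick numerical illustration of $\Delta f \approx f^{\prime}(x)\,\Delta x$, consider the function $f(x)=x^{2}$ (an illustrative choice, not from the text), for which $f^{\prime}(x)=2x$ and the leftover term is exactly $(\Delta x)^{2}$:

```python
# Numerical check of df = f'(x) dx for the illustrative choice f(x) = x**2,
# so f'(x) = 2x and Delta f = 2x*dx + dx**2 exactly.
def f(x):
    return x**2

def fprime(x):
    return 2 * x

x, dx = 3.0, 0.01
delta_f = f(x + dx) - f(x)   # exact change: 0.0601
linear = fprime(x) * dx      # differential estimate: 0.06
error = delta_f - linear     # O((dx)^2) remainder: 0.0001
print(delta_f, linear, error)
```

Shrinking `dx` makes the remainder vanish quadratically, which is what the $\mathcal{O}\left((\Delta x)^{2}\right)$ term expresses.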

Now, for two-variable functions, $f(x, y)$, we have

$$f(x+\Delta x, y+\Delta y)=f(x, y)+f_{x}(x, y)\,\Delta x+f_{y}(x, y)\,\Delta y+\text{H.O.T.},$$

where, again, the subscripts denote partial differentiation. This equation can be expressed as

$$\Delta f(x, y)=f_{x}(x, y)\,\Delta x+f_{y}(x, y)\,\Delta y+\text{H.O.T.},$$

which, in infinitesimal form, is

$$df=f_{x}\,dx+f_{y}\,dy,$$

where $df$ is the differential of the function $f$. The above formula is useful in solving exact first-order ODEs.

Finally, note that the formula for $\Delta f$ is a useful approximation; for instance, suppose we wanted to estimate the value of $\sqrt{(4.98)^{2}-(4.07)^{2}}$ without using a calculator. The number is obviously close to $\sqrt{5^{2}-4^{2}}=3$. We therefore consider the function $f(x, y)=\sqrt{x^{2}-y^{2}}$ and use the equation above with $\Delta x=4.98-5=-0.02$ and $\Delta y=4.07-4=0.07$. The partial derivatives $f_{x}$ and $f_{y}$ are given by

$$f_{x}=\frac{x}{\sqrt{x^{2}-y^{2}}}, \quad f_{y}=-\frac{y}{\sqrt{x^{2}-y^{2}}},$$

which, when evaluated at $(5,4)$, give $f_{x}(5,4)=5/3$ and $f_{y}(5,4)=-4/3$. The equation therefore gives

$$\Delta f(5,4)=\frac{5}{3}(-0.02)-\frac{4}{3}(0.07) \approx -0.13,$$

or

$$\sqrt{(4.98)^{2}-(4.07)^{2}} \approx 3-0.13=2.87.$$
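The worked estimate above can be checked in a few lines of Python; the code below evaluates the linear (total-differential) approximation around $(5, 4)$ and compares it with the exact value:

```python
# Reproduce the total-differential estimate of sqrt(4.98**2 - 4.07**2)
# around the base point (x0, y0) = (5, 4), where f(5, 4) = 3.
import math

def f(x, y):
    return math.sqrt(x**2 - y**2)

x0, y0 = 5.0, 4.0
dx, dy = 4.98 - x0, 4.07 - y0           # -0.02 and 0.07

fx = x0 / math.sqrt(x0**2 - y0**2)      # f_x(5, 4) = 5/3
fy = -y0 / math.sqrt(x0**2 - y0**2)     # f_y(5, 4) = -4/3

estimate = f(x0, y0) + fx * dx + fy * dy   # approx 2.87
exact = f(4.98, 4.07)
print(estimate, exact)
```

Both numbers come out close to 2.87, confirming that the linear term captures the change well for small $\Delta x$ and $\Delta y$.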

Stationary points

Finally, we briefly note that the notion of stationary points extends to multivariable functions as well. Recall that in the single-variable case, a point $x=a$ is a stationary point of $f(x)$ if $f^{\prime}(a)=0$. For a function of two variables, $f(x, y)$, the point $(a, b)$ is a stationary point of $f(x, y)$ if both partial derivatives vanish at $(a, b)$, i.e.

$$f_{x}(a, b)=0 \quad \text{and} \quad f_{y}(a, b)=0.$$
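The stationary-point condition is easy to verify numerically. The sketch below uses an illustrative function, $f(x, y)=(x-1)^{2}+(y+2)^{2}$ (not from the text), whose only stationary point is $(1, -2)$, and approximates the partial derivatives by central finite differences:

```python
# Numerically check f_x = f_y = 0 at the stationary point of the
# illustrative function f(x, y) = (x - 1)**2 + (y + 2)**2.
def f(x, y):
    return (x - 1)**2 + (y + 2)**2

def grad(x, y, h=1e-6):
    # central finite differences for f_x and f_y
    fx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    fy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return fx, fy

fx, fy = grad(1.0, -2.0)
print(fx, fy)   # both partial derivatives vanish at (1, -2)
```

At any other point, such as $(0, 0)$, at least one of the two partial derivatives is nonzero, so the condition really does single out $(1, -2)$.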

Multivariable functions are discussed in much more detail in the notes on multivariable functions.