Partial Differentiation

Real-world engineering problems rarely involve just one variable. The stress in a beam depends on load, length, and cross-section. The volume of a gas depends on pressure and temperature. To analyze these, we need multivariable calculus.

Functions of Several Variables

Multivariable Function

A function z = f(x, y) assigns a unique output z to every pair (x, y) in its domain. The graph of such a function is a surface in three-dimensional space.

Partial Derivatives

When we differentiate a function of multiple variables, we must specify which variable is changing while holding the others constant. This is a partial derivative.

Notation

  • \frac{\partial z}{\partial x} or f_x: Partial derivative with respect to x (treat y as constant).
  • \frac{\partial z}{\partial y} or f_y: Partial derivative with respect to y (treat x as constant).
  • Note the use of the "curly d" symbol (\partial).
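These rules are easy to check symbolically. The sketch below uses the SymPy library with an arbitrarily chosen sample function; the point is only that differentiating with respect to x treats y as a constant, and vice versa.

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 * y + sp.sin(y)   # sample function (an illustrative choice)

fx = sp.diff(f, x)   # y is held constant
fy = sp.diff(f, y)   # x is held constant

print(fx)   # 2*x*y
print(fy)   # x**2 + cos(y)
```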

[Interactive visualization: the surface z = 0.5(x^2 - y^2), with a red arrow showing the slope along the x-axis (\partial z/\partial x) and a green arrow showing the slope along the y-axis (\partial z/\partial y) at a chosen point.]

Higher-Order Partial Derivatives

Just as with single-variable functions, we can take second-order derivatives.

Second-Order Partial Derivatives

  • f_{xx} = \frac{\partial^2 z}{\partial x^2}
  • f_{yy} = \frac{\partial^2 z}{\partial y^2}
  • f_{xy} = \frac{\partial^2 z}{\partial y \partial x} (Mixed partial: differentiate w.r.t. x first, then y)
  • f_{yx} = \frac{\partial^2 z}{\partial x \partial y} (Mixed partial: differentiate w.r.t. y first, then x)

Clairaut's Theorem

Clairaut's Theorem: If the mixed partial derivatives are continuous, then the order of differentiation doesn't matter: f_{xy} = f_{yx}.
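A quick symbolic check of Clairaut's Theorem, again using SymPy on an arbitrarily chosen (smooth, hence continuous-mixed-partials) sample function:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**3 * y**2 + x * sp.exp(y)   # sample function (illustrative)

f_xy = sp.diff(f, x, y)   # differentiate w.r.t. x, then y
f_yx = sp.diff(f, y, x)   # differentiate w.r.t. y, then x

# Clairaut: the two mixed partials agree
assert sp.simplify(f_xy - f_yx) == 0
print(f_xy)   # 6*x**2*y + exp(y)
```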

Extrema of Functions of Two Variables (Second Partials Test)

To find the relative maxima and minima of a surface z = f(x, y), we first find critical points where f_x = 0 and f_y = 0. Then we use the Second Partials Test (involving the Hessian determinant) to classify them.

The Second Partials Test

  1. Find all critical points (a, b) such that f_x(a, b) = 0 and f_y(a, b) = 0.
  2. Compute the second partial derivatives f_{xx}, f_{yy}, and f_{xy}.
  3. Evaluate the discriminant (Hessian determinant) at (a, b): D = f_{xx}(a, b) f_{yy}(a, b) - [f_{xy}(a, b)]^2
  4. Classify the point based on D:
  • If D > 0 and f_{xx}(a, b) > 0, then f(a, b) is a local minimum.
  • If D > 0 and f_{xx}(a, b) < 0, then f(a, b) is a local maximum.
  • If D < 0, then (a, b) is a saddle point (neither a max nor a min; the surface looks like a horse saddle there).
  • If D = 0, the test is inconclusive.
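The steps above can be sketched in SymPy. The surface below is an illustrative choice; it has one local minimum at (1, 0) and one saddle point at (-1, 0):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**3 - 3*x + y**2   # sample surface (illustrative)

# Step 1: critical points where f_x = 0 and f_y = 0
fx, fy = sp.diff(f, x), sp.diff(f, y)
crit = sp.solve([fx, fy], [x, y], dict=True)

# Steps 2-4: second partials, discriminant D, classification
fxx, fyy, fxy = sp.diff(f, x, 2), sp.diff(f, y, 2), sp.diff(f, x, y)
for pt in crit:
    D = (fxx * fyy - fxy**2).subs(pt)
    if D > 0:
        kind = 'local min' if fxx.subs(pt) > 0 else 'local max'
    elif D < 0:
        kind = 'saddle point'
    else:
        kind = 'inconclusive'
    print(pt, kind)
```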

Lagrange Multipliers

Optimization problems in engineering often come with constraints. For instance, maximizing the volume of a box given a fixed amount of surface area material. The method of Lagrange Multipliers solves constrained optimization problems.

Method of Lagrange Multipliers

To maximize or minimize f(x, y, z) subject to the constraint g(x, y, z) = c, we find the points where the gradient of f is parallel to the gradient of g. This introduces a scalar parameter \lambda (lambda):
\nabla f = \lambda \nabla g
This expands into a system of equations: f_x = \lambda g_x, f_y = \lambda g_y, f_z = \lambda g_z, along with the original constraint g(x, y, z) = c. Solving this system yields the constrained critical points.
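A minimal two-variable sketch of the method, with an illustrative problem: maximize the rectangle area f = xy subject to a fixed "perimeter budget" x + y = 10. The constrained optimum is x = y = 5:

```python
import sympy as sp

x, y, lam = sp.symbols('x y lam')
f = x * y      # objective (illustrative: rectangle area)
g = x + y      # constraint g = 10

# The system ∇f = λ∇g together with the constraint equation
eqs = [sp.Eq(sp.diff(f, x), lam * sp.diff(g, x)),
       sp.Eq(sp.diff(f, y), lam * sp.diff(g, y)),
       sp.Eq(g, 10)]
sol = sp.solve(eqs, [x, y, lam], dict=True)
print(sol)   # x = y = 5, lambda = 5, so the maximum area is f = 25
```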

Chain Rule for Several Variables

If z = f(x, y), where x and y are themselves functions of another variable t, then the total derivative of z with respect to t is:
\frac{dz}{dt} = \frac{\partial z}{\partial x}\frac{dx}{dt} + \frac{\partial z}{\partial y}\frac{dy}{dt}
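A sanity check of the chain rule on an illustrative choice: z = x^2 + y^2 with x = cos t, y = sin t. Since the path traces the unit circle and z is the squared distance from the origin, dz/dt should be identically zero:

```python
import sympy as sp

t = sp.symbols('t')
x_, y_ = sp.symbols('x y')
z = x_**2 + y_**2      # z = f(x, y)
x = sp.cos(t)          # x(t), y(t) trace the unit circle
y = sp.sin(t)

# dz/dt = (dz/dx) * dx/dt + (dz/dy) * dy/dt, with partials evaluated on the path
dzdt = (sp.diff(z, x_).subs({x_: x, y_: y}) * sp.diff(x, t)
        + sp.diff(z, y_).subs({x_: x, y_: y}) * sp.diff(y, t))
print(sp.simplify(dzdt))   # 0 -- z is constant along the unit circle
```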

Implicit Partial Differentiation

Often, an equation F(x, y, z) = 0 defines z implicitly as a function of x and y. Instead of solving for z, we can find the partial derivatives using the Implicit Function Theorem.

Implicit Function Theorem Formula

  • To find the partial derivatives of z with respect to x and y:
\frac{\partial z}{\partial x} = -\frac{F_x(x, y, z)}{F_z(x, y, z)}
\frac{\partial z}{\partial y} = -\frac{F_y(x, y, z)}{F_z(x, y, z)}
where F_x, F_y, and F_z are the partial derivatives of F with respect to each variable, assuming F_z \neq 0.
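Applying the formula to an illustrative surface, the sphere F = x^2 + y^2 + z^2 - 14 = 0, gives the familiar results \partial z/\partial x = -x/z and \partial z/\partial y = -y/z:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
F = x**2 + y**2 + z**2 - 14   # sphere (illustrative choice of F)

dz_dx = -sp.diff(F, x) / sp.diff(F, z)   # -F_x / F_z
dz_dy = -sp.diff(F, y) / sp.diff(F, z)   # -F_y / F_z
print(sp.simplify(dz_dx))   # -x/z
print(sp.simplify(dz_dy))   # -y/z
```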

The Gradient Vector

The gradient of a function f(x, y) is a vector consisting of its partial derivatives. It plays a crucial role in determining the direction of steepest ascent on a surface.

The Gradient

For a function f(x, y), the gradient, denoted by \nabla f(x, y) (read "del f"), is the vector:
\nabla f(x, y) = \left\langle f_x(x, y), f_y(x, y) \right\rangle
Key Properties:
  • The gradient vector points in the direction of the maximum rate of increase of the function.
  • The magnitude of the gradient vector, |\nabla f|, gives the maximum rate of increase in that direction.
  • The gradient vector is always perpendicular (orthogonal) to the level curves of the function.
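All three properties can be checked on the illustrative paraboloid f = x^2 + y^2 at the point (1, 2), where the level curve through the point is the circle x^2 + y^2 = 5 with tangent direction \langle -2, 1 \rangle:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 + y**2        # paraboloid (illustrative)

grad = sp.Matrix([sp.diff(f, x), sp.diff(f, y)])   # gradient <2x, 2y>
g = grad.subs({x: 1, y: 2})

print(list(g))       # [2, 4] -- direction of steepest ascent at (1, 2)
print(g.norm())      # 2*sqrt(5) -- the maximum rate of increase

# Orthogonality: the level curve's tangent direction <-2, 1> at (1, 2)
# is perpendicular to the gradient, so the dot product vanishes.
tangent = sp.Matrix([-2, 1])
print(g.dot(tangent))   # 0
```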

Gradient Vector Visualization

[Interactive visualization: the surface f(x, y) = x^2 + y^2, showing the gradient \nabla f = \langle 2x, 2y \rangle at the cursor position. The gradient vector always points directly outward from the origin, perpendicular to the circular level curves, marking the direction of steepest ascent on the paraboloid.]

Directional Derivatives

The gradient tells us the rate of change in the directions of the axes and the direction of maximum change. But what if we want to know the rate of change in an arbitrary direction? The directional derivative provides this.

Directional Derivative

The directional derivative of f(x, y) in the direction of a unit vector \mathbf{u} = \langle u_1, u_2 \rangle is defined as the dot product of the gradient and the unit vector:
D_{\mathbf{u}}f(x, y) = \nabla f(x, y) \cdot \mathbf{u} = f_x(x, y)u_1 + f_y(x, y)u_2
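A worked instance with illustrative choices: f = x^2 y at the point (1, 2), in the direction of \langle 3, 4 \rangle (normalized to the unit vector \langle 3/5, 4/5 \rangle). Here f_x(1, 2) = 4 and f_y(1, 2) = 1, so D_u f = 4(3/5) + 1(4/5) = 16/5:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 * y                  # sample function (illustrative)

grad = sp.Matrix([sp.diff(f, x), sp.diff(f, y)])
u = sp.Matrix([3, 4]) / 5     # unit vector in the direction <3, 4>

# Directional derivative = gradient dotted with the unit vector
D_u = grad.dot(u).subs({x: 1, y: 2})
print(D_u)   # 16/5
```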

Total Differentials

The total differential dz approximates the total change in z due to small simultaneous changes dx in x and dy in y:
dz = \frac{\partial z}{\partial x} dx + \frac{\partial z}{\partial y} dy
This is fundamental in error analysis for experiments with multiple measured variables.

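A numeric error-analysis sketch with illustrative measurements: a cylinder's volume V = \pi r^2 h, with r = 2 measured to within 0.01 and h = 5 to within 0.02. The total differential dV = 2\pi r h\,dr + \pi r^2\,dh bounds the resulting volume error:

```python
import math

# Cylinder volume V = pi * r^2 * h; propagate small measurement errors.
# The measured values and uncertainties below are illustrative.
r, h = 2.0, 5.0
dr, dh = 0.01, 0.02          # measurement uncertainties

# dV = (dV/dr) dr + (dV/dh) dh, with partials 2*pi*r*h and pi*r^2
dV = 2 * math.pi * r * h * dr + math.pi * r**2 * dh
V = math.pi * r**2 * h

print(dV)        # about 0.88 -- worst-case volume error
print(dV / V)    # 0.014, i.e. about a 1.4% relative error
```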
Key Takeaways
  • Partial Derivatives measure the rate of change with respect to one variable while holding others constant. Geometrically, they represent the slopes of tangent lines in the x and y directions.
  • The Second Partials Test classifies critical points on surfaces as local max, min, or saddle points using the Hessian discriminant.
  • Lagrange Multipliers optimize multivariable functions subject to constraint equations.
  • The Chain Rule for multivariable functions sums the contributions from each intermediate variable.
  • The Gradient Vector points in the direction of steepest ascent and its magnitude gives the maximum rate of change. It is orthogonal to level curves.
  • Total Differentials approximate changes in multivariable functions, useful for total error estimation.