Partial Differentiation
Real-world engineering problems rarely involve just one variable. The stress in a beam depends on load, length, and cross-section. The volume of a gas depends on pressure and temperature. To analyze these, we need multivariable calculus.
Functions of Several Variables
Multivariable Function
A function z = f(x, y) assigns a unique output z to every pair (x, y) in its domain. The graph of such a function is a surface in 3D space.
Partial Derivatives
When we differentiate a function of multiple variables, we must specify which variable is changing while holding the others constant. This is a partial derivative.
Notation
- ∂f/∂x or f_x: Partial derivative with respect to x (treat y as constant).
- ∂f/∂y or f_y: Partial derivative with respect to y (treat x as constant).
- Note the use of the "curly d" symbol (∂), which distinguishes partial derivatives from ordinary derivatives (d).
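Holding one variable constant while differencing the other gives a direct numerical check of a partial derivative. A minimal sketch, assuming the example function f(x, y) = x²y + sin(y) (not from the text), with analytic partials f_x = 2xy and f_y = x² + cos(y):

```python
import math

def f(x, y):
    return x**2 * y + math.sin(y)  # assumed example function

def partial_x(f, x, y, h=1e-6):
    """Central difference: vary x, hold y constant."""
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

def partial_y(f, x, y, h=1e-6):
    """Central difference: vary y, hold x constant."""
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

# Analytic values at (1, 2): f_x = 2xy = 4, f_y = x^2 + cos(y) = 1 + cos(2)
print(partial_x(f, 1.0, 2.0))  # ≈ 4.0
print(partial_y(f, 1.0, 2.0))  # ≈ 1 + cos(2) ≈ 0.5839
```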
[Interactive figure: a 3D surface with a red arrow showing the slope along the x-axis (∂f/∂x) and a green arrow showing the slope along the y-axis (∂f/∂y) at a movable point.]
Higher-Order Partial Derivatives
Just as with single-variable functions, we can differentiate again to obtain second-order partial derivatives.
Second-Order Partial Derivatives
- f_xx = ∂²f/∂x² (differentiate twice w.r.t. x); f_yy = ∂²f/∂y² (differentiate twice w.r.t. y).
- f_xy = (f_x)_y (mixed partial: differentiate w.r.t. x, then y).
- f_yx = (f_y)_x (mixed partial: differentiate w.r.t. y, then x).
Clairaut's Theorem
Clairaut's Theorem: If the mixed partial derivatives f_xy and f_yx are continuous, then the order of differentiation doesn't matter: f_xy = f_yx.
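Clairaut's Theorem can be checked numerically by nesting central differences in both orders. A sketch assuming the example f(x, y) = x³y², for which f_xy = f_yx = 6x²y:

```python
def f(x, y):
    return x**3 * y**2  # assumed example; f_xy = f_yx = 6 x^2 y

def partial(g, x, y, wrt, h=1e-5):
    """Central difference of g with respect to the named variable."""
    if wrt == "x":
        return (g(x + h, y) - g(x - h, y)) / (2 * h)
    return (g(x, y + h) - g(x, y - h)) / (2 * h)

f_x = lambda x, y: partial(f, x, y, "x")
f_y = lambda x, y: partial(f, x, y, "y")

f_xy = partial(f_x, 1.0, 2.0, "y")  # differentiate w.r.t. x, then y
f_yx = partial(f_y, 1.0, 2.0, "x")  # differentiate w.r.t. y, then x
print(f_xy, f_yx)  # both ≈ 12 = 6 x^2 y at (1, 2)
```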
Extrema of Functions of Two Variables (Second Partials Test)
To find the relative maxima and minima of a surface z = f(x, y), we first find critical points where f_x = 0 and f_y = 0. Then, we use the Second Partials Test (involving the Hessian determinant) to classify them.
The Second Partials Test
- Find all critical points (a, b) such that f_x(a, b) = 0 and f_y(a, b) = 0.
- Compute the second partial derivatives: f_xx, f_yy, and f_xy.
- Evaluate the discriminant (Hessian determinant) at (a, b): D = f_xx(a, b) f_yy(a, b) - [f_xy(a, b)]².
Classify the point based on D:
- If D > 0 and f_xx(a, b) > 0, then f(a, b) is a local minimum.
- If D > 0 and f_xx(a, b) < 0, then f(a, b) is a local maximum.
- If D < 0, then (a, b) is a saddle point (neither max nor min, looks like a horse saddle).
- If D = 0, the test is inconclusive.
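The classification rules translate directly into code. A sketch applying them to the assumed example f(x, y) = x³ - 3x + y²: setting f_x = 3x² - 3 = 0 and f_y = 2y = 0 gives critical points (1, 0) and (-1, 0), with f_xx = 6x, f_yy = 2, f_xy = 0:

```python
def classify(fxx, fyy, fxy):
    """Second Partials Test: classify a critical point from its second partials."""
    D = fxx * fyy - fxy**2  # Hessian determinant
    if D > 0 and fxx > 0:
        return "local minimum"
    if D > 0 and fxx < 0:
        return "local maximum"
    if D < 0:
        return "saddle point"
    return "inconclusive"

# Assumed example f = x^3 - 3x + y^2: f_xx = 6x, f_yy = 2, f_xy = 0
print(classify(6, 2, 0))   # at (1, 0):  local minimum
print(classify(-6, 2, 0))  # at (-1, 0): saddle point
```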
Lagrange Multipliers
Optimization problems in engineering often come with constraints. For instance, maximizing the volume of a box given a fixed amount of surface area material. The method of Lagrange Multipliers solves constrained optimization problems.
Method of Lagrange Multipliers
To maximize or minimize f subject to the constraint g = 0, we find the points where the gradient of f is parallel to the gradient of g: ∇f = λ∇g. This introduces a scalar parameter λ (lambda).
This expands into a system of equations: f_x = λg_x, f_y = λg_y, f_z = λg_z, along with the original constraint g(x, y, z) = 0. Solving this system yields the constrained critical points.
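A small worked example (assumed, not from the text): maximize f(x, y) = xy subject to x + y = 10. The Lagrange system gives y = λ and x = λ, so x = y = 5 and the maximum is 25; a brute-force scan along the constraint confirms it:

```python
# Maximize f(x, y) = x*y subject to g(x, y) = x + y - 10 = 0 (assumed example).
# Lagrange system: y = λ·1, x = λ·1, x + y = 10  ⇒  x = y = 5, so max f = 25.
xs = [i / 1000 for i in range(10001)]  # x in [0, 10]; on the constraint, y = 10 - x
best_value, best_x = max((x * (10 - x), x) for x in xs)
print(best_value, best_x)  # 25.0 5.0
```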
Chain Rule for Several Variables
If z = f(x, y) where x and y are themselves functions of another variable t, then the total derivative of z with respect to t is: dz/dt = (∂z/∂x)(dx/dt) + (∂z/∂y)(dy/dt).
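The chain-rule sum can be verified against a direct numerical derivative of the composite z(t). A sketch assuming z = x²y with x = cos t and y = sin t (an assumed example):

```python
import math

def z(x, y):
    return x**2 * y          # assumed example function

def x_of(t):
    return math.cos(t)

def y_of(t):
    return math.sin(t)

def dz_dt_chain(t):
    """Total derivative via the multivariable chain rule."""
    x, y = x_of(t), y_of(t)
    dz_dx, dz_dy = 2 * x * y, x**2          # partials of z
    dx_dt, dy_dt = -math.sin(t), math.cos(t)
    return dz_dx * dx_dt + dz_dy * dy_dt

def dz_dt_direct(t, h=1e-6):
    """Direct central difference of z as a function of t alone."""
    g = lambda s: z(x_of(s), y_of(s))
    return (g(t + h) - g(t - h)) / (2 * h)

t = 0.7
print(abs(dz_dt_chain(t) - dz_dt_direct(t)) < 1e-6)  # the two agree
```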
Implicit Partial Differentiation
Often, an equation F(x, y, z) = 0 defines z implicitly as a function of x and y. Instead of solving for z explicitly, we can find the partial derivatives using the Implicit Function Theorem.
Implicit Function Theorem Formula
- To find the partial derivative of z with respect to x or y: ∂z/∂x = -F_x / F_z and ∂z/∂y = -F_y / F_z.
Where F_x, F_y, and F_z are the partial derivatives of the function F with respect to each variable, assuming F_z ≠ 0.
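A quick check of ∂z/∂x = -F_x/F_z, assuming the sphere F(x, y, z) = x² + y² + z² - 14 = 0 at the point (1, 2, 3), where the formula predicts -x/z = -1/3:

```python
import math

# F(x, y, z) = x^2 + y^2 + z^2 - 14 = 0 (assumed example: a sphere)
# Formula: ∂z/∂x = -F_x / F_z = -(2x)/(2z) = -x/z
x, y = 1.0, 2.0
z = math.sqrt(14 - x**2 - y**2)   # explicit solution on the upper hemisphere (z = 3)
dzdx_formula = -x / z             # -1/3

# Numerical check: differentiate the explicit z(x, y) directly
h = 1e-6
z_plus = math.sqrt(14 - (x + h)**2 - y**2)
z_minus = math.sqrt(14 - (x - h)**2 - y**2)
dzdx_numeric = (z_plus - z_minus) / (2 * h)

print(round(dzdx_formula, 6), round(dzdx_numeric, 6))  # both ≈ -0.333333
```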
The Gradient Vector
The gradient of a function is a vector consisting of its partial derivatives. It plays a crucial role in determining the direction of steepest ascent on a surface.
The Gradient
For a function f(x, y), the gradient, denoted by ∇f (read "del f"), is the vector: ∇f = ⟨∂f/∂x, ∂f/∂y⟩ = f_x i + f_y j.
Key Properties:
- The gradient vector points in the direction of the maximum rate of increase of the function.
- The magnitude of the gradient vector, |∇f|, gives the maximum rate of increase in that direction.
- The gradient vector is always perpendicular (orthogonal) to the level curves of the function.
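These properties are easy to verify for a concrete function. A sketch assuming f(x, y) = x² + y², whose gradient is ⟨2x, 2y⟩:

```python
import math

def grad_f(x, y):
    """Gradient of f(x, y) = x^2 + y^2 (assumed example)."""
    return (2 * x, 2 * y)

gx, gy = grad_f(3.0, 4.0)
magnitude = math.hypot(gx, gy)  # |∇f| = maximum rate of increase at (3, 4)
print((gx, gy), magnitude)      # (6.0, 8.0) 10.0
```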
[Interactive figure: the surface f(x, y) = x² + y² with the gradient ∇f = ⟨2x, 2y⟩ and its magnitude |∇f| drawn at the cursor position. The red gradient vector always points directly outward from the origin, perpendicular to the circular level curves, showing the direction of steepest ascent on the paraboloid.]
Directional Derivatives
The gradient tells us the rate of change in the directions of the axes and the direction of maximum change. But what if we want to know the rate of change in an arbitrary direction? The directional derivative provides this.
Directional Derivative
The directional derivative of f in the direction of a unit vector u = ⟨a, b⟩ is defined as the dot product of the gradient and the unit vector: D_u f = ∇f · u = f_x a + f_y b.
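A small worked example (assumed): for f(x, y) = x² + y² at (1, 2), the rate of change in the direction ⟨3, 4⟩ is found by normalizing the direction to a unit vector and dotting it with the gradient:

```python
import math

fx, fy = 2 * 1, 2 * 2            # ∇f = <2x, 2y> for f = x^2 + y^2 at (1, 2)

v = (3.0, 4.0)                   # desired direction (assumed example)
norm = math.hypot(*v)            # |v| = 5
u = (v[0] / norm, v[1] / norm)   # must normalize to a unit vector first

D_u = fx * u[0] + fy * u[1]      # dot product ∇f · u = 2(0.6) + 4(0.8)
print(D_u)                       # ≈ 4.4
```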
Total Differentials
The total differential dz = (∂z/∂x) dx + (∂z/∂y) dy approximates the total change in z due to small simultaneous changes in x (dx) and y (dy).
This is fundamental in error analysis for experiments with multiple measured variables.
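A typical error-analysis use (assumed example, not from the text): a cylinder's volume V = πr²h measured with uncertainties dr and dh. The total differential dV = 2πrh dr + πr² dh estimates the resulting volume error, and comparing against the exact change shows how good the linear approximation is:

```python
import math

# Cylinder V = π r^2 h with measured r = 2 ± 0.01, h = 5 ± 0.02 (assumed values)
r, h = 2.0, 5.0
dr, dh = 0.01, 0.02              # measurement uncertainties

dV_dr = 2 * math.pi * r * h      # ∂V/∂r
dV_dh = math.pi * r**2           # ∂V/∂h
dV = dV_dr * dr + dV_dh * dh     # total differential estimate of the error

# Exact change for comparison
actual = math.pi * (r + dr)**2 * (h + dh) - math.pi * r**2 * h
print(round(dV, 4), round(actual, 4))  # the differential closely tracks the true change
```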
Key Takeaways
- Partial Derivatives measure the rate of change with respect to one variable while holding others constant. Geometrically, they represent the slopes of tangent lines in the x and y directions.
- The Second Partials Test classifies critical points on surfaces as local max, min, or saddle points using the Hessian discriminant.
- Lagrange Multipliers optimize multivariable functions subject to constraint equations.
- The Chain Rule for multivariable functions sums the contributions from each intermediate variable.
- The Gradient Vector points in the direction of steepest ascent and its magnitude gives the maximum rate of change. It is orthogonal to level curves.
- Total Differentials approximate changes in multivariable functions, useful for total error estimation.