Section Objectives
Let $z = f(x,y)$ be a two-variable function whose first-order partial derivatives exist at $(x_0, y_0)$. Also let $\Delta x$ and $\Delta y$ be increments of $x$ and $y$, respectively. The differentials of $x$ and $y$ are defined by
$$dx = \Delta x \quad\text{and}\quad dy = \Delta y,$$
and the total differential of $z$ is defined by
$$dz = f_x(x_0, y_0)\,dx + f_y(x_0, y_0)\,dy.$$
Roughly speaking, the total differential tells us how an infinitesimal change in the dependent variable is related to small changes in the independent variables. In this way, the total differential $dz$ approximates the increment $\Delta z$:
$$\Delta z \approx dz,$$
where $\Delta z = f(x_0 + \Delta x,\, y_0 + \Delta y) - f(x_0, y_0)$. To "approximate by differentials" is to use the approximation $\Delta z \approx dz$.
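To see the approximation $\Delta z \approx dz$ numerically, here is a minimal Python sketch using a hypothetical function $f(x,y) = x^2 + 3xy - y^2$ (chosen for illustration; it is not one of this section's examples):

```python
# Hypothetical example function f(x, y) = x^2 + 3xy - y^2,
# with partial derivatives f_x = 2x + 3y and f_y = 3x - 2y.
def f(x, y):
    return x**2 + 3*x*y - y**2

def fx(x, y):
    return 2*x + 3*y

def fy(x, y):
    return 3*x - 2*y

x0, y0 = 2.0, 1.0      # base point
dx, dy = 0.01, -0.02   # small increments

delta_z = f(x0 + dx, y0 + dy) - f(x0, y0)   # exact increment
dz = fx(x0, y0)*dx + fy(x0, y0)*dy          # total differential

# The two values agree to about two decimal places for these increments,
# and the agreement improves as dx and dy shrink.
print(delta_z, dz)
```

Here $dz = 7(0.01) + 4(-0.02) = -0.01$, while the exact increment is $\Delta z = -0.0109$; the error is on the order of the squares of the increments.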
Example
The period $T$ of a simple pendulum of length $L$ is given by
$$T = 2\pi\sqrt{\frac{L}{g}},$$
where $g$ is the acceleration due to gravity. A pendulum is moved from a location where $g = g_1$ ft/s$^2$ to a location where $g = g_2$ ft/s$^2$. There was also a temperature change that resulted in a change in the length of the pendulum from $L_1$ ft to $L_2$ ft. Use differentials to approximate the corresponding change in the pendulum's period.
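Since the example's specific measurements are not reproduced above, the Python sketch below uses hypothetical values for the lengths and gravitational accelerations, just to show the mechanics of the computation:

```python
import math

# Sketch with hypothetical values (not the example's actual data):
# T(L, g) = 2*pi*sqrt(L/g), so
#   T_L = pi / sqrt(L*g)   and   T_g = -pi*sqrt(L) / g**1.5.
L0, g0 = 4.0, 32.2      # hypothetical initial length (ft) and gravity (ft/s^2)
dL, dg = -0.03, 0.06    # hypothetical increments in L and g

T_L = math.pi / math.sqrt(L0 * g0)
T_g = -math.pi * math.sqrt(L0) / g0**1.5

dT = T_L*dL + T_g*dg    # total differential: approximate change in the period

# Compare against the exact change in T:
exact = 2*math.pi*math.sqrt((L0 + dL)/(g0 + dg)) - 2*math.pi*math.sqrt(L0/g0)
print(dT, exact)        # the differential matches the exact change closely
```

With these hypothetical increments the differential and the exact change agree to about four decimal places, which is typical when the increments are small relative to $L$ and $g$.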
For functions of a single variable, "having a derivative" and "being differentiable" mean the same thing. For functions of several variables, however, the partial derivative is a weaker concept than the corresponding Calculus I derivative. For example, for functions of a single variable, wherever the derivative exists, the function is automatically continuous. For multi-variable functions, the existence of partial derivatives does not necessarily imply continuity.
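The classic illustration of this (a standard textbook counterexample, not one worked in this section) is $f(x,y) = \dfrac{xy}{x^2+y^2}$ with $f(0,0) = 0$: both partial derivatives exist at the origin, yet $f$ is not continuous there. A quick Python check:

```python
# f(x, y) = x*y/(x^2 + y^2), with f(0, 0) = 0, has both partial
# derivatives at the origin but is not continuous there.
def f(x, y):
    if x == 0 and y == 0:
        return 0.0
    return x*y/(x**2 + y**2)

# Partials at the origin from the limit definition: f(h, 0) = 0 and
# f(0, h) = 0 for every h, so both difference quotients are identically 0.
h = 1e-6
fx0 = (f(h, 0) - f(0, 0))/h   # = 0
fy0 = (f(0, h) - f(0, 0))/h   # = 0

# But along the line y = x we get f(t, t) = 1/2 for every t != 0, so f has
# no limit at the origin and cannot be continuous there.
print(fx0, fy0, f(1e-9, 1e-9))   # both partials 0, yet f stays at 1/2 nearby
```

So the partials exist at the origin even though values of $f$ arbitrarily close to the origin stay at $\tfrac{1}{2}$.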
Our goal now is to generalize Calculus I differentiability to multi-variable functions so that the new generalized idea retains the "strength" it initially had.
Definition
Let $z = f(x,y)$. The function $f$ is differentiable at $(x_0, y_0)$ if $\Delta z$ can be written in the form
$$\Delta z = f_x(x_0, y_0)\,\Delta x + f_y(x_0, y_0)\,\Delta y + \varepsilon_1\,\Delta x + \varepsilon_2\,\Delta y,$$
where $\varepsilon_1$ and $\varepsilon_2$ are functions of $x_0$, $y_0$, $\Delta x$, and $\Delta y$ having the property that $\varepsilon_1, \varepsilon_2 \to 0$ as $(\Delta x, \Delta y) \to (0, 0)$.
According to this definition, to be differentiable means that a function can be well-approximated by differentials and that the approximation gets better and better as $\Delta x$ and $\Delta y$ get small. This is completely analogous to the definition of differentiability in Calculus I, but it wasn't presented there in quite the same way. With this definition of differentiability, we can continue to say that differentiability implies continuity: if a function of any number of variables is differentiable at a point, then it is continuous at that point.
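For simple functions the definition can be checked by hand. As a sketch, take the hypothetical function $f(x,y) = x^2 + y^2$ (not one of this section's examples): expanding $\Delta z$ gives $\Delta z = 2x_0\,\Delta x + 2y_0\,\Delta y + \Delta x\cdot\Delta x + \Delta y\cdot\Delta y$, so we can read off $\varepsilon_1 = \Delta x$ and $\varepsilon_2 = \Delta y$, both of which tend to $0$ with the increments:

```python
# For f(x, y) = x^2 + y^2, expanding Delta z gives
#   Delta z = 2*x0*dx + 2*y0*dy + dx*dx + dy*dy,
# so eps1 = dx and eps2 = dy, and both tend to 0 as (dx, dy) -> (0, 0):
# f is differentiable at every point in the sense of the definition.
def f(x, y):
    return x**2 + y**2

x0, y0 = 1.5, -0.5
for scale in [0.1, 0.01, 0.001]:
    dx = dy = scale
    delta_z = f(x0 + dx, y0 + dy) - f(x0, y0)
    eps1, eps2 = dx, dy   # epsilon terms read off from the expansion
    recon = 2*x0*dx + 2*y0*dy + eps1*dx + eps2*dy
    assert abs(delta_z - recon) < 1e-12   # the identity holds exactly
    print(scale, eps1, eps2)              # epsilons shrink with the increments
```

The point of the exercise is that the "error" terms beyond the differential are themselves proportional to the increments, which is exactly what the definition demands.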
You have noticed by now (after only one example!) that the definition of differentiability is rather difficult to work with. Fortunately, we have the following theorem.
Theorem
If $f$ is a function such that $f$, $f_x$, and $f_y$ all exist and are continuous in a neighborhood of $(x_0, y_0)$, then $f$ is differentiable at $(x_0, y_0)$.
Let's return now to the idea of approximation by differentials:
$$\Delta z \approx dz = f_x(x_0, y_0)\,\Delta x + f_y(x_0, y_0)\,\Delta y.$$
Solving the approximation for $f(x_0 + \Delta x,\, y_0 + \Delta y)$ gives
$$f(x_0 + \Delta x,\, y_0 + \Delta y) \approx f(x_0, y_0) + f_x(x_0, y_0)\,\Delta x + f_y(x_0, y_0)\,\Delta y.$$
If we substitute $x - x_0$ for $\Delta x$ and $y - y_0$ for $\Delta y$, the last expression takes the form
$$f(x, y) \approx f(x_0, y_0) + f_x(x_0, y_0)(x - x_0) + f_y(x_0, y_0)(y - y_0).$$
According to this approximation, at an arbitrary point $(x, y)$ that is close to $(x_0, y_0)$, we should expect
$$f(x, y) \approx f(x_0, y_0) + f_x(x_0, y_0)(x - x_0) + f_y(x_0, y_0)(y - y_0).$$
The right-hand side of this expression is a linear function of the variables $x$ and $y$, and this motivates the following definition.
Definition
Suppose the function $f$ has continuous first partial derivatives in a neighborhood of the point $(x_0, y_0)$. The linearization of $f$ at $(x_0, y_0)$ is given by
$$L(x, y) = f(x_0, y_0) + f_x(x_0, y_0)(x - x_0) + f_y(x_0, y_0)(y - y_0).$$
The approximation $f(x, y) \approx L(x, y)$ is called the standard linear approximation at $(x_0, y_0)$.
If $f$ is a differentiable function, we should expect this approximation to be pretty good near the point $(x_0, y_0)$. Of course, the farther we get from $(x_0, y_0)$, the less we should expect from the approximation.
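As a quick numerical check, here is the linearization of $f(x,y) = 2x^2 - 3xy + 8y^2 + 2x - 4y + 4$ at $(2, -1)$, the same surface and point used in the Sage plot at the end of this section, computed in plain Python:

```python
# Linearization of f(x, y) = 2x^2 - 3xy + 8y^2 + 2x - 4y + 4 at (2, -1),
# the surface plotted in the Sage cell at the end of the section.
def f(x, y):
    return 2*x**2 - 3*x*y + 8*y**2 + 2*x - 4*y + 4

def fx(x, y):   # partial derivative with respect to x
    return 4*x - 3*y + 2

def fy(x, y):   # partial derivative with respect to y
    return -3*x + 16*y - 4

x0, y0 = 2, -1

def L(x, y):    # L(x, y) = f(x0, y0) + fx*(x - x0) + fy*(y - y0)
    return f(x0, y0) + fx(x0, y0)*(x - x0) + fy(x0, y0)*(y - y0)

print(f(x0, y0), fx(x0, y0), fy(x0, y0))   # 34 13 -26
print(f(2.1, -0.9), L(2.1, -0.9))          # close near the point of tangency
```

This reproduces the coefficients $34$, $13$, and $-26$ that appear in the tangent-plane formula $g$ in the Sage cell, and shows that $L$ tracks $f$ closely near $(2, -1)$.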
You may not have noticed, but the graph of the linearization is a plane. Take a closer look:
$$z = f(x_0, y_0) + f_x(x_0, y_0)(x - x_0) + f_y(x_0, y_0)(y - y_0).$$
This is a linear equation in the three variables $x$, $y$, and $z$ (everything else is a number). Therefore, its graph is a plane in 3-dimensional space. In fact, that plane is the tangent plane. Analogous to a tangent line, the plane is tangent to the graph of $f$ at the point $(x_0, y_0, f(x_0, y_0))$. We will come back to this idea in section 4.6, but for now, let's do a few examples.
var("x,y")
f = 2*x^2 - 3*x*y + 8*y^2 + 2*x - 4*y + 4   # the surface z = f(x, y)
g = 34 + 13*(x-2) - 26*(y+1)                # its linearization at (2, -1)
A = plot3d(f, (x,0,4), (y,-3,1), color="lightgray", opacity=0.8)   # the surface
B = plot3d(g, (x,0,4), (y,-3,1), color="yellow", opacity=0.5)      # the tangent plane
C = point([(2,-1,34)], size=60, color="black")                     # point of tangency
show(A+B+C)