
Welcome to calculus.

I'm Professor Ghrist.


We're about to begin lecture ten on
derivatives.
We're about to begin chapter two, on
differentiation.
Now, you may know how to compute some
derivatives, but do you know what they
really mean?
In this lesson, we'll distinguish between
the interpretation and the definition of
a derivative and by the end of the
lesson, you'll be seeing derivatives
everywhere.
In your introduction to Calculus, you
probably learned derivatives as slopes.
Well, you may be surprised that we've not talked very much about slopes in this course thus far, and that's because slope is an interpretation of the derivative, not what it really means.
It is not the definition.
In fact, it's a pretty poor
interpretation.
For example, when you go to multivariable calculus, you're going to look at functions with multiple inputs and multiple outputs.
What does a derivative mean for that?
Let's say you have a vector field.
Or you're looking at a digital image that
consists of many, many variables.
Or worse still, you're looking at
functions that aren't smooth enough to
have a well defined slope.
In all of these contexts, derivatives
make sense even when slopes do not.
We'll consider three different
definitions of the derivative.
The derivative of f at x equals a is
first the limit as x goes to a of f of x
minus f of a over x minus a.
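In symbols, this first definition reads

f'(a) = \lim_{x \to a} \frac{f(x) - f(a)}{x - a}.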
This might not be the definition that you
recall seeing.
But it's a great definition because you can interpret it as the limit of the change in the output over the change in the input, as the input tends to a.
This definition, among all of the
definitions, is to me conceptually the
clearest.
You have an input, and an output, and as
you change that input at a certain rate,
the derivative tells you at what rate the
output is changing.
Our second definition should look familiar.
The derivative of f at a is the limit as h goes to 0 of f of a plus h minus f of a, over h.
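In symbols,

f'(a) = \lim_{h \to 0} \frac{f(a+h) - f(a)}{h}.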
This, too, has the interpretation as the limit of the change in output over the change in input, as the change in input, h, goes to 0.
Of course, this is really the same definition; we've simply performed a change of variables, setting h equal to x minus a, the change in the input.
If you substitute x minus a in for h above, writing x as a plus h, then you'll see that this is really the same definition.
Our third definition is a bit different
and it's in terms of the first order
variation of the output.
What do I mean by that?
Well, consider your function evaluated not at a but at a plus h. What is that equal to?
Well, it's f of a plus some perturbation
term, some variation term that depends on
h.
We say that the derivative of f at a is the constant C satisfying: f at a plus h equals f at a, plus C times h, plus other stuff that is of higher order in h, that is, big O of h squared.
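In symbols, with C playing the role of the derivative,

f(a + h) = f(a) + C\,h + O(h^2).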
This constant in front of the first-order term is what we define the derivative at a to be.
Now, this is not the definition that you are used to seeing.
In fact there are some problems with this
definition.
Some people call this a strong derivative, because sometimes it does not exist even when the true derivative, in terms of a limit, does.
For purposes of this course, we're not
going to worry about that distinction so
much.
Thinking in terms of a Taylor expansion, and using the language of big O to control higher-order terms, is going to wind up being very illuminating.
Let's compare these different definitions
in the context of a simple example.
The simplest one I can think of is f of x
equals x to the n for some positive
integer n.
In this case, the first two definitions of the derivative of f at a are of the form: the limit as h goes to 0 of f at a plus h minus f at a, all over h.
Knowing that f is really raising to the nth power allows us to simplify a bit.

What happens when we take a plus h and raise it to some positive integer?
Well, the first term is going to be a to the n.
The next term is going to be n times a to the n minus 1, times h.
This comes from the binomial theorem, or from multiplying out as a long polynomial.
What about all of the other terms?
I may not have room on the page to write them all, but I know that all of them have powers of h that are at least quadratic.
I'm going to compress all of that together and call it big O of h squared, which is exactly what it is.
And now we see a perhaps familiar computation coming about: the a to the n terms cancel, and everything that is left has some power of h.
We can factor that, cancel with the
denominator, and we obtain the limit as h
goes to 0 of n times a to the n minus 1
plus some terms that are in big O of h, so that as h goes to zero they vanish, and we are left with the familiar result: the derivative of x to the n at a is n times a to the n minus 1.
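Written out, that computation is

\frac{(a+h)^n - a^n}{h} = \frac{n a^{n-1} h + O(h^2)}{h} = n a^{n-1} + O(h) \;\to\; n a^{n-1} \text{ as } h \to 0.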
Now it's worth observing that really, that definition of the derivative is the same as the third definition, in terms of the variation of the output.
If we take our function f and evaluate it at a plus h, what do we get?
A plus h, to the n.
This has a constant term, a to the n; a first-order term, n times a to the n minus 1, times h; plus higher-order terms in h.
That coefficient of the first-order term is the derivative at a.
A few other examples will help illustrate this.
Let's look at some familiar functions, e
to the x, cosine of x, and square root of
x.
First of all, for e to the x, let's compute the first-order variation by evaluating it at x plus h.
Now, we know that e to the x plus h is e to the x times e to the h.
And although this is a little bit of circular reasoning, we know what e to the h means: it's really 1 plus h, plus terms of higher order in h.
So if we expand this out, we see that the constant term is e to the x.
The first-order term in h is e to the x times h.
All other terms are higher order in h, and from this simple computation we can see that the derivative of e to the x is e to the x; it's the coefficient in front of the first-order term in h.
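In symbols,

e^{x+h} = e^x e^h = e^x (1 + h + O(h^2)) = e^x + e^x\,h + O(h^2).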
Likewise for cosine: what happens when we look at cosine of x plus h?
Well, we can use the summation formula for cosines and expand this as cosine of x times cosine of h, minus sine of x times sine of h.
And again, using what we know about cosine and sine, we can expand the terms in h and say that cosine of h is 1 plus big O of h squared, and sine of h is h plus big O of h cubed.
Now, if we take these terms and rearrange them a little bit, we get a zeroth-order term of cosine of x, as we must.
The first order term has, as its
coefficient in front of h, negative sine
of x.
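Written out,

\cos(x+h) = \cos x \cos h - \sin x \sin h = \cos x + (-\sin x)\,h + O(h^2).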
That is the derivative of cosine of x.
For the square root, we have to simplify the square root of x plus h.
How do we do that?
Well, if we factor out a square root of x, what is left over is 1 plus h over x, all to the power one half, or the square root of it if you like.
I write it in this form so that we see the binomial series come into play, and using that binomial series gives us an expansion of 1, plus one half times h over x, plus something in big O of h over x, quantity squared.
Now if we expand that out, we see that the zeroth-order term in h is, of course, square root of x.
The first-order term simplifies to one half, x to the minus one half, times h.
And from that, we see the derivative.
Now notice that all of the higher-order terms involve a square root of x and something in big O of h squared over x squared.
If we want to ignore that and call that
big O of h squared, we'd better make sure
that x is positive.
When x is 0, we have a problem.
The derivative does not exist.
But as long as x is bounded away from zero, we have a well-defined expansion in terms of h, and we can read off the derivative as one over two square root of x.
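Written out, for x > 0,

\sqrt{x+h} = \sqrt{x}\,(1 + h/x)^{1/2} = \sqrt{x} + \frac{1}{2\sqrt{x}}\,h + O(h^2).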
There's a lot of notation associated with
derivatives.
The derivative of the function y equals f
of x, can be denoted in the following
ways.
Some of the best notation is df/dx, or dy/dx.

But there's some other notation that you might see or use from time to time, including f prime, or y prime, or y dot.
Now why the difference between these?
Well, the best notations are those that tell you exactly which variables are changing: dy/dx means the rate of change of y with respect to change in x.
On the other hand, f prime or y dot can
be a bit ambiguous.
The dot in particular connotes change
with respect to time.
Now sometimes you'll see a differential
notation of the form df.
This can be helpful, but again remember
which variables are changing.
The one thing that you must never do is write something foolish like canceling the d's to get df/dx equals f over x.
No way.
Don't get creative with your handwriting
on these.
Don't use a scripty cursive d, don't use a Greek delta, don't do anything like that, because those symbols have meanings that you have not yet learned.
Don't ever do any of this.
Stick to the standard notation, please.
Now, the most common examples of
derivatives wind up involving rates of
change with respect to time.
For example, in your introductory physics course, you've certainly seen velocity or acceleration as a derivative.
These are derivatives of position with
respect to time.
But there are other examples as well.
If you study current, current is the rate of change of the charge flowing through a wire with respect to time.
In chemistry, if you look at reaction rates in a chemical reaction, these are defined as the derivative of, say, the concentration of a product or a reactant, with respect to time.
There are many other examples of
derivatives that are not necessarily
rates of change with respect to time.
If you look at a spring, one can define the spring constant as the rate of change of the force applied with respect to the deflection.
The elastic modulus of a stretchy material is a rate of change of stress with respect to strain.
And if you want to know what viscosity is, or how slippery a fluid is?
Well, this is defined in terms of quantities like shear stress and the rate of change of the velocity of a sliding fluid with respect to height.
And finally, economics consists of all
manner of interesting derivatives.
For example, if you look at marginal tax
rates, these are defined as the rate of
change of the amount of tax collected,
with respect to change in income.
All of these are wonderful examples of
derivatives.
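To put a few of these into the d-notation from earlier (the letters below are just conventional symbol choices, assumed here for illustration, not fixed by the lecture):

v = \frac{ds}{dt}, \quad I = \frac{dq}{dt}, \quad k = \frac{dF}{dx}, \quad E = \frac{d\sigma}{d\varepsilon}.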
There's so much more to derivatives than
slope.
Look around you.
Do you see something that changes?
That is a derivative.
Derivatives are ubiquitous, and understanding their proper definition helps us to interpret, find, and then use derivatives.
In our next lesson, we'll consider how to
quickly compute derivatives.
