Question 1189353: Suppose that f(x) is differentiable on the interval (a, b), is continuous on the interval [a, b], and f(a) = f(b). Then at some value c ∈ (a, b), f'(c) = [f(b) - f(a)]/(b - a).
1. Mean Value Theorem
2. Rolle's theorem
3. Newton's theorem
4. Taylor's theorem
Answer by ikleyn(52747):
Strictly speaking, what is written in the post is none of the listed theorems.
It is a soup, a mixture of the two theorems (1) and (2),
presented in a form that is never used in Calculus.
Rolle's theorem states that if a function f is continuous on the closed interval [a, b], differentiable
on the open interval (a, b), and f(a) = f(b), then f'(c) = 0 for some c with a < c < b.
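As a concrete illustration (the function and interval here are just an assumed example, not part of the original post), a minimal Python check of Rolle's theorem:

    # Rolle's theorem on the assumed example f(x) = x^2 - 2x over [0, 2]:
    # f(0) = f(2) = 0, and f'(x) = 2x - 2 vanishes at c = 1 in (0, 2).

    def f(x):
        return x**2 - 2*x

    def fprime(x):          # derivative of f, computed by hand
        return 2*x - 2

    a, b = 0.0, 2.0
    assert f(a) == f(b)     # hypothesis: equal values at the endpoints
    c = 1.0                 # the point guaranteed to exist by Rolle's theorem
    assert a < c < b and fprime(c) == 0.0   # conclusion: f'(c) = 0
    print("f'(%.1f) = %.1f" % (c, fprime(c)))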
The Mean Value Theorem states that if a function f is continuous on the closed interval [a, b] and differentiable
on the open interval (a, b), then there exists a point c in the interval (a, b) such that
f'(c) = [f(b) - f(a)]/(b - a), the function's average rate of change over [a, b].
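Again for illustration only (this particular function and interval are assumed, not from the post), a minimal Python check of the Mean Value Theorem:

    # Mean Value Theorem on the assumed example f(x) = x^2 over [0, 3]:
    # average rate = (9 - 0)/3 = 3, and f'(c) = 2c = 3 at c = 1.5 in (0, 3).

    def f(x):
        return x**2

    def fprime(x):                        # derivative of f, computed by hand
        return 2*x

    a, b = 0.0, 3.0
    avg_rate = (f(b) - f(a)) / (b - a)    # average rate of change over [a, b]
    c = avg_rate / 2                      # solve f'(c) = 2c = avg_rate by hand
    assert a < c < b and fprime(c) == avg_rate   # conclusion of the MVT
    print("c = %.1f, f'(c) = %.1f = average rate" % (c, fprime(c)))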
To make my point even clearer, consider making a salad. You finely shred the cabbage and the carrot
(pre-washed and cleaned). Then you mix them. Then you ask a student whether it is a cabbage OR a carrot.
To make it even clearer, consider making another salad. You slice cucumbers and tomatoes, and mix them.
Then you ask a student whether this mixture is a cucumber OR a tomato.
By the way, each salad is fantastically tasty . . . I recommend it to anyone . . . for free, i.e. with no fees.