? asked in Science & Mathematics · 1 decade ago

About the Mean Value Theorem

Show that if f''(x) > 0 throughout an interval [a,b], then f'(x) has at most one zero in [a,b].

What if f''(x) < 0 throughout [a,b] instead?

Update:

Show that if f''(x) > 0 throughout an interval [a,b], then f'(x) has at most one zero in [a,b].

What if f''(x) < 0 throughout [a,b] instead?

Some of the symbols in the original statement ran together and were easy to confuse; this is the corrected problem.

2 Answers

  • L
    Lv 7
    1 decade ago
    Favorite Answer

    As cgkm pointed out, you only need to assume that f'(x) has two or more zeros and apply the Mean Value Theorem; a contradiction follows immediately. Written out:

    Assume that f'(x) has at least two zeros in [a,b]. Then we can choose p ≠ q in [a,b] such that f'(p) = 0 = f'(q). But by the M.V.T., 0 = [f'(q) - f'(p)]/(q - p) = f''(c) for some c between p and q, a contradiction. Hence f'(x) has at most one zero in [a,b].

    The case f''(x) < 0 is the same; the proof above works unchanged.
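    For readability, here is the same argument written out in LaTeX (a minimal sketch restating the answer above, with p, q, c as in the answer):

        \begin{proof}
        Suppose, for contradiction, that $f'$ has two distinct zeros $p < q$ in $[a,b]$.
        Since $f''$ exists throughout $[a,b]$, $f'$ is continuous on $[p,q]$ and
        differentiable on $(p,q)$, so the Mean Value Theorem applied to $f'$ gives a
        point $c \in (p,q)$ with
        \[
          f''(c) = \frac{f'(q) - f'(p)}{q - p} = \frac{0 - 0}{q - p} = 0,
        \]
        contradicting $f''(x) > 0$ on $[a,b]$. Hence $f'$ has at most one zero in $[a,b]$.
        The same argument works verbatim when $f''(x) < 0$, since $f''(c) = 0$ again
        contradicts the hypothesis.
        \end{proof}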

  • Eric
    Lv 6
    1 decade ago

    Hint: Suppose you can find two distinct roots c and d, and use the Mean Value Theorem on g = f'.
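    As a concrete check (my own example, not part of the original hint): take f(x) = x^2 on [-1, 1]. Then f''(x) = 2 > 0 throughout, and f'(x) = 2x has exactly one zero, at x = 0, so "at most one zero" is sharp; it cannot be strengthened to "no zero". For the f''(x) < 0 case, f(x) = -x^2 behaves the same way.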
