Analysis: Prove that K is compact if and only if every sequence in K has a subsequence that converges to a point in K?

2 Answers

  • 9 years ago
    Favorite Answer

    This is only true if K is also a metric space. Otherwise there are counterexamples to both directions (see link).
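
    (The linked page is not reproduced here. For reference, and not part of the original answer, two standard counterexamples are:)

    \[
    \{0,1\}^{[0,1]}\ \text{(product topology): compact but not sequentially compact;}
    \qquad
    [0,\omega_1)\ \text{(order topology): sequentially compact but not compact.}
    \]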

    In a metric space, however, we can use the fact that a point p is a limit point of an infinite set A if and only if every open set containing p also contains infinitely many points of A.
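
    In metric-space notation (a restatement added here, not the answerer's wording), with B(p, r) the open ball of radius r about p:

    \[
    p \ \text{is a limit point of}\ A
    \iff \forall\,\varepsilon > 0:\ B(p,\varepsilon) \cap \bigl(A \setminus \{p\}\bigr) \neq \emptyset
    \iff \forall\,\varepsilon > 0:\ B(p,\varepsilon) \cap A\ \text{is infinite.}
    \]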

    Now to prove the claim:

    Forward direction: Suppose K is compact and (x_n) is a sequence in K. If the sequence takes only finitely many distinct values, then some value x is hit infinitely often, and the corresponding constant subsequence converges to x in K, so we are done. So assume the set of values A = {x_n : n > 0} is infinite. We show, by contradiction, that K contains a limit point of A. Suppose not. Then for every point x in K there is an open set V_x, containing x, that contains only finitely many points of A. Now {V_x : x in K} is clearly an open cover of K, so there is a finite subcover {V_(x_1), V_(x_2), ..., V_(x_n)}. But each V_(x_i) contains only finitely many points of A, so their union contains only finitely many points of A. This is a contradiction, since {V_(x_i)} covers K and A is an infinite subset of K.
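
    Schematically (my summary of the counting step, not in the original answer):

    \[
    A \subseteq K \subseteq \bigcup_{i=1}^{n} V_{x_i}
    \quad\Longrightarrow\quad
    |A| \le \sum_{i=1}^{n} \bigl|A \cap V_{x_i}\bigr| < \infty,
    \]

    contradicting the fact that A is infinite.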

    Therefore K contains a limit point x of A. Let B_n be the open ball of radius 1/n centered at x. Each B_n contains infinitely many points of A, so we can choose indices k_1 < k_2 < k_3 < ... with x_(k_n) in B_n for every n. Then (x_(k_n)) is a subsequence of (x_n) converging to x, and x is in K.
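
    The convergence itself comes straight from the radius bound (a one-line check, added here):

    \[
    d\bigl(x_{k_n}, x\bigr) < \frac{1}{n} \to 0 \quad (n \to \infty), \qquad \text{so } x_{k_n} \to x.
    \]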

    Reverse direction: Suppose every sequence in K has a subsequence that converges to a point in K. Let {O_i : i > 0} be a countable open cover of K; we want to show that it has a finite subcover. (Restricting to countable covers is enough: a sequentially compact metric space is totally bounded, hence separable, hence Lindelöf, so every open cover of K has a countable subcover; see the sketch below.)
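
    The reduction to countable covers rests on a standard chain of implications for metric spaces (stated here for completeness; the original answer takes it for granted):

    \[
    \text{sequentially compact} \Longrightarrow \text{totally bounded} \Longrightarrow \text{separable} \Longrightarrow \text{second countable} \Longrightarrow \text{Lindelöf},
    \]

    and Lindelöf means precisely that every open cover has a countable subcover.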

    For n > 0, let V_n be the union of all O_i for 0 < i <= n, and let K_n = K \ V_n.
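
    In symbols (my restatement, together with the monotonicity that is used below):

    \[
    V_n = \bigcup_{i=1}^{n} O_i, \qquad K_n = K \setminus V_n, \qquad V_1 \subseteq V_2 \subseteq \cdots, \qquad K_1 \supseteq K_2 \supseteq \cdots.
    \]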

    Suppose, for a contradiction, that K_n is nonempty for all n, and choose an element x_n of K_n for each n. Then (x_n) is a sequence in K, so it has a subsequence (y_i) = (x_(n_i)), with n_1 < n_2 < ..., converging to a point y in K. Note that y_i lies in K_(n_i), and since n_i >= i and the V_n increase, K_(n_i) is contained in K_i; so y_i is in K_i, i.e. y_i is not in V_i.

    Fix m > 0. We want to show that y is not in V_m. So suppose y is in V_m. Since V_m is open and y is the limit of (y_i), there is a k > 0 such that y_i is in V_m for all i >= k.

    Case 1: k <= m. Then y_m is in V_m, which is a contradiction since y_m is in K_m = K \ V_m (by the note above).

    Case 2: m < k. Then y_k is in V_m, which is contained in V_k, so y_k is in V_k; the same contradiction, since y_k is in K_k = K \ V_k.

    Therefore y is not in V_m. Since m was arbitrary, y is not in V_m for any m. But y is in K, and K is covered by the union of all the V_m (since the O_i cover K), so y must lie in some V_m. This contradiction refutes our initial assumption that K_n is nonempty for all n.
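
    In symbols, the contradiction is (my summary):

    \[
    y \in K \subseteq \bigcup_{m \ge 1} V_m, \qquad \text{yet} \qquad y \notin V_m \ \text{for every}\ m \ge 1,
    \]

    which is impossible.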

    Thus there is an n such that K_n is empty, i.e. K \ V_n is empty, i.e. K is contained in V_n, which is the union of the O_i for 0 < i <= n. So {O_1, O_2, ..., O_n} is the desired finite subcover.

  • casco
    Lv 4
    3 years ago

    I forget the details of subsequence convergence, although it is common sense that a subsequence can only converge if the sequence it is derived from converges. It ties into the second part of your question. You know that for a series to converge, its limit must be equal to 0. If you remove an element from a series, its limit cannot increase.
