They don't wear make-up out of "insecurity," dear, not unless they happen to be insecure, and not all women are. (For that matter, plenty of men are insecure. It's not a sex-exclusive condition.) Make-up makes you look more polished and better groomed. In professional fields like mine, wearing some make-up is really de rigueur. As Nancy says, wearing make-up is just culturally normative for women.
However, I'll agree that it doesn't belong at places like the beach, and you shouldn't need to apply make-up just to run to the grocery store.