Where is the best place to live in the United States?
I don't plan on returning to Naples because life there is hard and I've already established my education in the United States.
I want to move somewhere in the southern US, preferably an hour or so from the ocean, to a place/state that has a low crime rate and where I can establish my career. I'm going to college to become a veterinarian, but I want to move somewhere the pay is high and living expenses are reasonable.
I've considered Texas, North Carolina, California, and Florida.
But because I was born in New Jersey, I've heard that Southerners HATE Northerners, and I wouldn't want to move to a place where I'm not wanted. On top of that, I'm not sure I want to move to California with everything going on on the West Coast.
So with all that I've listed, where do you think is the best place to live, in the South or just outside it? (I hate wintertime. But if the winters are short, I guess I can deal with it. Hahaha.)