Anonymous asked in Environment › Other - Environment · 2 months ago

We ask the real experts: will things on Earth ever get better, or only worse?

2 Answers

  • 2 months ago

Yes, they will get better, just as they did after worse pandemics such as the Plague of Justinian, the Black Death, and the Spanish Flu.

  • 2 months ago

It usually takes a bloody war to make things better.
