Why is religion important?

I was raised Baptist as a kid but never really understood why we went to church. I'm not trying to say that religion is stupid, but why is it necessary? Where did it come from? Everyone gets so defensive about their religion, but who actually knows what to believe until death?

5 Answers

  • 8 years ago
    Favorite Answer

    Church provides a social network. It provides answers to difficult questions in life, and guidelines for how to live your life. People get defensive because they have been conditioned never to question what they are told about religion. I'm a Buddhist, and I'm still trying to decide where I fall on the spectrum between atheism and theism... you might find the talk by Richard Dawkins interesting in terms of why he thinks religion is *not* important.

    I have no agenda, and I hope you come to a good conclusion, whatever that may be.

  • Anonymous
    8 years ago

    Christian churches, as a rule, do not teach their doctrines. Christians generally have no idea what they are officially supposed to believe.

    In the 19th century Americans started moving west, and three institutions went with them. There was vaudeville, traveling entertainment. There was the lyceum, traveling education and culture. And there was the itinerant preacher, offering a new style of preaching called "hellfire and brimstone". It was very entertaining and only loosely based on scripture, and pastors didn't even try to compete. Instead they switched to preaching public morality and philosophy. Eventually an entire generation grew up not knowing the first thing about the denomination they claimed to believe in. Since about 1970 a few people have dedicated themselves to rediscovering what the Bible actually says, but there is considerable resistance based on "faith of our fathers", which is to say a famously brainless devotion to faith for its own sake.

    Read the Bible from Romans to 2 Thessalonians over and over until you start to remember what it says; that is the part that applies to Christians. Keep reading until you start to notice it says some things that are different from what you have been told it says. Keep reading until you figure out that the Bible is more reliable than the people who told you otherwise. Here is a book to help you study the Bible. It's a free download, and you can buy a hard copy at any Bible bookstore.

    http://philologos.org/__eb-htetb/ "How To Enjoy The Bible"

  • Pancho
    Lv 7
    8 years ago

    To begin with, Christianity doesn't qualify as religion anymore, so let's set that aside. Religion has to do with ancient practices that bring one to a very high state and into communion with the Deity. That is accomplished through the deepest meditations -- some of the tribal cultures use things like ayahuasca and so on -- but the idea is to get oneself into an incredibly high state. All of that is forgotten nowadays, of course, as evidenced by the very existence of (modern) Christianity, which is nothing more than a degenerated version of what Jesus taught centuries ago ...

  • 8 years ago

    I suggest you don't wait till death to find out. Where did it come from? Genesis 1: "In the beginning God created the heaven and the earth." Fast forward: Jesus died on the cross so we could have eternal life. We are so unworthy and sinful that he took the brunt of God's wrath, because none of us are worthy. That was tough, seeing as God gave up his only Son. The Christ you see pictured on the cross is nothing like how he would really have looked, and not one of us would have survived the beating he took. 1 Cor 6:20: "For ye are bought with a price: therefore glorify God in your body, and in your spirit, which are God's." Get back to church and seek what you missed. You don't want to experience God's wrath next time around.

    Source(s): e-Sword
  • Anonymous
    8 years ago

    Because united we stand:
