Is the United States a Christian nation?
- Dandintac, Lv 6, 4 years ago (Favorite Answer)
It depends on what one means by that. If you mean only that a majority of the population self-identifies as Christian, then yes, we are.
But I don't think this is what people mean when they stridently claim "this is a CHRISTIAN Nation."
Some try to claim that the United States was founded on Biblical principles. But this is not true.
The Treaty of Tripoli, although not a founding document, was ratified unanimously by the Senate, and it explicitly states that the US was not founded as a Christian nation. Here's the actual text: "As the government of the United States of America is not in any sense founded on the Christian Religion..." Many of the men who signed it were among the Founding Fathers, and this clearly signifies their thinking on the matter.
Furthermore, the US Constitution, the single most important founding document of our nation, mentions religion NOWHERE except to restrict its influence in our government! There is the "no religious test" clause (Article VI, Section 3), which explicitly says: "...but no religious test shall ever be required as a qualification to any office or public trust under the United States." If this were a "Christian Nation," it would say the opposite: that all office holders must be Christian.
Also, there is the Establishment Clause of the First Amendment: "...Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof..." This has been repeatedly described as a "wall of separation between church and state"--by key founding fathers, such as Jefferson and Madison, and also by repeated SCOTUS rulings over generations of Constitutional Law.
Clearly, we are not in any legal or meaningful sense "a Christian Nation." If a majority are Christian, so what? Those who insist so vigorously that we are have a political agenda: they wish to see their particular religious beliefs, currently blocked by the Constitution, accepted unquestioningly as legitimate law.
- lapin, Lv 5, 4 years ago
There are more Christians (Catholics and Protestants) in the US than any other religion, so I would say yes.
- Moondoggy, Lv 7, 4 years ago
Culturally, yes. Officially, no.
- Bobby Jim, Lv 7, 4 years ago
It started out that way, but who knows now?
- mark, Lv 4, 4 years ago
- Trilobiteme, Lv 5, 4 years ago
Most are Christian, but we have freedom of religion. "Love your neighbor as yourself" means to love people who don't agree with you.
- Anonymous, 4 years ago
No. Enough said.
- G C, Lv 7, 4 years ago
It always has been.
- Anonymous, 4 years ago
The United States of America established its judicial laws after the Judaic laws of the Old Testament, while our culture became Christian driven. So America's God has been the God of Abraham.
Source(s): The God Yahweh, our Intelligent Designer and the missing link in the chain of evolution.
- Tony R, Lv 7, 4 years ago
The vast majority of biblical laws are not punishable by the government. There are only two I can think of that parallel the Bible: you can't murder, and you can't steal. All the other things the Bible condemns are legal to do.
Saying it's a Christian nation is a huge stretch. You can say there are Christians living in it.
- Anonymous, 4 years ago
No. It used to be, but those days are long over, and so are the high times of the USA.