Anonymous asked in Society & Culture › Religion & Spirituality · 1 decade ago

Do you think it would make a difference if Christianity was taught in schools?

And as a devoted Christian, would you like that even though it violates separation of church and state?

Update:

Yeah, and not to mention, it's UN-AMERICAN.

25 Answers

  • 1 decade ago
    Favorite Answer

    Did you notice how our schools went downhill so fast after they took prayer out of schools? Did you notice that after 9/11, even George W. Bush told people to pray in the schools, even though it was unconstitutional? Christianity should be taught in schools even if it violates separation of church and state. These laws are what you call "unrighteous" laws, which stem from a devilish wisdom. The laws of love, justice, and mercy are "higher" than government laws; therefore they take precedence over any man-made law. There was a story in the Old Testament about Elijah the Prophet, who was being mocked by the children of a certain village. Elijah called down a curse on the children, and two bears came out of the woods and mauled them. Do you know why that happened? Because the parents did not teach their children about God, nor to reverence Him or His prophet. As a result of not being taught God's ways, the children suffered the wrath of God.

  • Realistically, the answer to your question can't be given unless someone knows what you mean by Christianity, and what you mean by taught (as in, to what degree): to teach the concepts, or to fail someone because they can't write a paper that shows they have adopted that mindset? And what aspect of Christianity would be relevant to teach inside of schools anyway, unless an ethics class were to be added? As far as, say, the subject of cosmology or biology, I don't necessarily think it should be taught, but I definitely think a more accurate critique of the fatal flaws that have cropped up and not been addressed in evolution should also be taught, and I definitely think elements of the evolution account that have been disproved (such as the drawings of the comparative embryos, which were shown to be doctored over 120 years ago) should be taken out of the textbooks. I don't think a particular worldview should be taught, so by that same standard I think the reigning philosophy of the day, materialistic naturalism, should be yanked as well.

    As far as the church-and-state thing, the Constitution does not say that. There is a non-establishment clause, which says the state will not create an official state religion. They most certainly had England and the Anglican church in mind when they wrote it. The goal was not to keep any church out of the government, but the government out of the church.

  • Anonymous
    1 decade ago

    No way. The Constitution states that there is a separation of church and state. Public schools should never teach religion; that is what Catholic schools and private schools can do, but not a public school. I swear that if public schools start teaching religion, I will run for office and put a stop to it, if it's the last thing I do. Keep religion in the home and church. That's where it belongs.

  • jimbob
    Lv 6
    1 decade ago

    I am a teacher in a public school, and I spend a good deal of time each school year teaching Christianity. We begin by tracing the history of the theological forebears of Christianity: the Jews. During the study of Rome, I set the stage for the arrival of "messiah": we read the words of Christ as recorded in Holy Scripture and discuss their meaning, consider the significance of the Resurrection, and trace the historical development of the Christian Church.

    How can a child understand American society if s/he does not know what Christianity is? I would be doing my students a grave disservice if they left my classroom ignorant of perhaps the single most important social influence upon our nation. We also spend significant amounts of time studying Islam, Hinduism, Buddhism, and Shinto.

  • I would hate to see what the school system would do to Christianity, truth be told. I don't want it in public schools.

    I do think kids should be allowed (not forced) to pray. I think they should be told in science class that evolution is the best explanation scientists have, but that it's not the only possibility. A child shouldn't have to choose between science and his/her conscience. I don't need ID taught.

    I think the arts should make a comeback and some form of logic and ethics. Restore literature that deals with morals and social responsibility.

  • No, I would not like this. Whose style of Christianity would we teach, and why does anyone think they have the right to push this on people? I am a Catholic, and I do not think that anyone has this right. There are all faiths and no faiths in the U.S., and they all have rights. If you wish, you can teach your kids at home (that's what I do), you can send them to Christian schools (many do), or you can teach them after school like so many others. Neither do I think children should be taught that religion is a myth.

  • Anonymous
    1 decade ago

    Yes. But the teaching would need to be nondenominational. It might be best to simply add creation (the Biblical account) in schools if evolution is taught. A non-essential religion class could be offered before or after school. Give students a time and place to pray if they wish, and put ALL the words in the Pledge of Allegiance, etc.

  • 1 decade ago

    I believe there is only one way to God, and that is by Jesus Christ.

    That being said... I am also realistic enough to realize that you can't force-feed anyone this information. People have to either believe it or not. If Christianity were taught in school, then to be fair, every other religion on the face of the earth would want equal time as well.

    I don't think it's practical. If parents wish their children to have an education based on biblical principles, they are going to have to avoid putting their children in public schools.

  • 1 decade ago

    Actually, it can be taught in schools if it is taught as literature, social studies, or history. Yes, it would probably do quite a bit of good. At least then more people would know what they are talking about. Too many people hate the Bible, which they have never read.

  • 1 decade ago

    It is taught in schools... there are lots of religious schools throughout the USA and the world...

    It used to be taught in U.S. public schools... or the Bible was, anyway... as a way to learn to read and write, because it was the one book that was in most homes.
