Talk:Fundamentalist Christianity
Fundamentalism is a term, often used pejoratively, referring to evangelical Christians. The term originated in the early 20th century, when a group of evangelical thinkers published a series of pamphlets titled "The Fundamentals," in an effort to define themselves in opposition to the liberalization of American social culture during that period.

Merriam-Webster defines fundamentalism as "a movement in 20th century Protestantism emphasizing the literally interpreted Bible as fundamental to Christian life and teaching." [1]

Over time, the word has come to describe any belief system regarded as oppressive or derogatory toward those outside it, such as Islamic fundamentalism in the Middle East, or the Westboro Baptist Church in Kansas, which stages anti-homosexuality protests at military funerals.