Americans Are Losing Faith

Last week Newsweek published a cover story titled “The End of Christian America.” A pretty ominous title if you ask me. The article has already come up in several conversations I’ve had in person and online.

The article centers on polling data showing that in 1990, 86% of Americans claimed to be Christians, whereas today only 76% claim that title. That is a drop of 10 percentage points in about 20 years, which is pretty significant.

The author works around this idea of a “Christian America” in this way:

What, then, does it mean to talk of “Christian America”? Evangelical Christians have long believed that the United States should be a nation whose political life is based upon and governed by their interpretation of biblical and theological principles. If the church believes drinking to be a sin, for instance, then the laws of the state should ban the consumption of alcohol. If the church believes the theory of evolution conflicts with a literal reading of the Book of Genesis, then the public schools should tailor their lessons accordingly.

Read the full article HERE.

A few quick thoughts of mine:

  1. If the description above continues to be the way Christianity most commonly shows itself in America, then you can expect more people to avoid it.
  2. If 76% of Americans still claim to be Christians, I think it's fair to say that we aren't in a post-Christian America yet. However, the days when Judeo-Christian values (I think that's what people call them) held strong sway are long gone.
  3. This is a rebellion against religion.
  4. The church hasn’t done a very good job over the past 20 years.
  5. I agree with Dan Kimball, who says this is more of a rebellion against certain expressions of Christianity than a rebellion against Christianity as a whole.

Any thoughts on the declining number of self-proclaimed Christians in America?