A few days ago, an op-ed piece was published in the Washington Post. It was authored by one of the men behind the book "Un-Christian." Gabe Lyons is an author, speaker and the founder of the Q Learning Community. In his essay, Lyons writes about the potential for the Gospel in a Post-Christian America. I'll post a link to Lyons's piece at the bottom of this bit. I highly recommend that you read it.
Also, you can find out more about Un-Christian here ---> http://www.unchristian.com/
BUT FIRST, I thought that I should post a relevant song which explains where I am at with the idea of America, or ANY other country, being considered "Christian." You can hate me for my viewpoint, but NOT for my sense of rhythm. (Gold lapels & wide-legged pants are not included.)
When I was growing up, I believed that America was a "Christian" country, with all of the supposed blessings and special status that such a title implied. I couldn't quite figure out why anyone would say otherwise. To me it was pretty obvious. We were the greatest, richest, most powerful country in the world. I figured that you only got that because you had divine favor. And you only got divine favor by doing everything right in God's eyes.
The first significant crack in that facade came when I attended a prophecy conference held by Jerry Falwell in Jerusalem while I was going to school in Israel. Tim LaHaye was wrapping up the speaking for that evening, and he made the prophetic statement that at some point, God was going to send fireballs from heaven to kill off all of the earth's communists...including all of those in American academic circles. At that point, the conference room erupted into a prolonged standing ovation.
I thought to myself that, even if that were true, cheering over God killing and then damning tens of millions of people to hell was not the most Jesus-like thing I had witnessed. Needless to say, the cracks in my facade of supposed "Christian" America only got bigger from then on.
CAVEAT: I always offer this statement, because so many people are so easily prone to emotional reactions to the idea that America is not a Christian country.
I love the U.S. It's a great country. I am thankful beyond belief that I was born here and live here. Again, I LOVE this country! So please do not call me a bleeding heart, leftist anti-American. (Seriously, I'm moderately Conservative and NOT a fan of Michael Moore.)
That having been said, America is not now, never was, and never will be a "Christian" country. Neither will Canada, Bangladesh, Norway or Belize ever be Christian. Why? Because such an idea simply is not biblical. All the nations of the earth are simply "Kingdoms of Men." Biblically, the only Christian country is the Kingdom of God/Heaven. My constant frustration is that so many people of faith who live in the U.S. confuse these two realms.
At any rate, all of the statistical evidence out there now shows that the U.S. is rapidly going the way of Europe and becoming a Post-Christian society. And to this, all I can say is "AMEN! Thank you Jesus for liberating your church from slavery and idolatry!!!!" I would much rather live in a nation that is completely pagan & honestly secular, where I have to actually live out my faith for all to see, than live in a safe & politically easy idolatry where following Jesus is merely a check-off list.
Anyone who has paid any attention at all has seen the slide this nation has been on. And anyone who has paid any attention at all must have surely noticed that the American church has not just dropped the ball, but has also stuck a nail in that same ball and almost completely deflated it. (Again, I would reference the Un-Christian book.) All of the statistical data shows that people around 30 and younger get a very bad taste in their mouths when they think of Christianity.
For too long, thanks in large part to the Religious Right (though certainly not solely their fault), the American church has sought to legislate the will of God. It would appear that this was much easier than going out and developing authentic relationships with non-Christians, loving them and then showing the Gospel to them by the way we live.
My delight in the idea that Christian America is dying is that it will force the people who call themselves the followers of Jesus to actually FOLLOW Jesus. Jesus wants his people to follow him into his kingdom. He wants us to participate with his father as he breaks into our fallen & broken world. You will always find God in the mud & blood, sweat & grime, pouring out his love to those at the end of their ropes.
God does show up in the halls of wealth and power, but not as often as he does in the trenches. The halls of power are ruled by sinful men & women. Since they have little use for God, other than as a prop for their election campaign, God doesn't seem to waste his time with them. (Honestly, I get tired of Jesus being used as a campaign slogan. And I truly believe that Jesus does too.)
I can only speak for myself, and only based on my personal experiences. But as such, when it comes to the golden calf we have raised up called "Christian America" collapsing down around us, all I can say is "BURN BABY, BURN!"
Can I get an Amen?
Here is the link to Gabe Lyons's op-ed piece. http://onfaith.washingtonpost.com/onfaith/guestvoices/2010/10/the_good_news_about_the_end_of_christian_america.html#more
Also, here is a clip of an interview that Lyons did on CNN after the Un-Christian book first came out.