While browsing Reddit a few days ago (always a dangerous thing to do), I came across an article originally posted in the atheism sub-reddit. This article stated that according to recent survey findings, the number of Americans who did not claim a particular religion was on the rise. In 2014, about 21% of all Americans claimed no religious preference, an increase of 7.5 million people since 2012.
Are these statistics cause for concern? Is America losing its religion?
The atheist commenters on Reddit expressed delight at this growing trend with remarks such as these:
“This is a very promising trend and it will grow exponentially.”
“I’d call this an overall gain for society.”
“It’s a start!”
While these comments may sound harsh to Christians, there is perhaps a deeper truth behind them.
I read something the other day by a Nazarene pastor in Seattle that really stuck with me. He essentially said that unchurched people in the Pacific Northwest consider becoming a Christian a “step down” morally. In other words, non-Christians feel like they have a higher sense of morality and stronger ethics than most “church folks.”
In many non-Christians’ eyes, Christians are typically bigoted, closed-minded, ignorant, and illogical. They don’t want to be associated with a Church that has too often been power-hungry and corrupt. They have encountered too many religious people who are legalistic, hypocritical, self-righteous, and Pharisaical.
On the other hand, non-believers see themselves on a higher moral plane. They are fighters for justice and peace. They strive for equal rights for all people, including minorities and those in the LGBTQ community. They work hard to take care of the environment. They have a deep appreciation for both science and the arts. Even the divorce rates of non-Christians are the same as (or lower than) those of people who claim Christianity. (So much for the sanctity of marriage!)
This same pastor in Seattle also made another bold statement:
“Atheism is not my problem. The lack of credible Christianity is.”
While many evangelical Christians may be alarmed by the transition into a Post-Christian American society, I wonder if it could be a good thing.
If we look back over the history of Christianity, we see that the Church grew the most when it was in the minority. Communities of Christians were able to show the world a new way of living, one that was counter-cultural. The Way of Christ was intriguing to many, and people wanted to be a part of it. In times and places where Christians were in the margins, Christ came in and moved. It seems that God works best in those situations.
In a Post-Christian world, our faith will be put to the test. We will be held accountable for our actions. The world will be watching us to see if we are authentic believers. We will truly be called upon to be salt and light to a dark world. We will need to be the defenders of justice and fighters for peace. We will need to offer the world something different, a new way of living. We will discover what it means to be a Christian in more than just name.
When Christianity is no longer “trendy,” we may find out who the true followers of Christ are.
And personally, I’m ready for that day to come. Are you?