Tuesday, January 13, 2009

Living in a Post-Christian Society

When you search eBay for 'silver paten' (the plate used in the Holy Eucharist), 90% of the results are Wiccan, not Christian.

There are many indications that our culture has become Post-Christian, meaning that Christianity is no longer the dominant model for religious faith--or at the very least, no longer the only acceptable option. This represents such a drastic change from the past that I don't think we as 'Church' have even begun to grasp its significance.

What does this mean for us? First, I think it calls us to strive for new ways of expressing our faith. We can no longer be certain that a newcomer to our church knows the basic story of Jesus, or any of the other 'whys' or 'hows' of our faith.

Second, it requires us to be able to articulate why our faith--and our faith community--is important to us. Do we go to church out of habit? Or are we involved in our church for deeper reasons? And if we are, what are those deeper reasons that keep us in the family of faith? We must be willing to share these things with others.

Do you agree that America is Post-Christian? And if so, in what other ways does this cultural reality call the Church into new ways of being?