"The term “post-Christian” is used often, and therefore often contested. It suggests that our culture, the culture of Western civilization, is undergoing a fundamental transformation from something explicitly Christian to something explicitly secular. Normally, “post-Christian” also calls to mind a connection with the thought of the Ancients, especially through Christian intermediaries."
Sunday, July 19, 2015
Do We Live in a Post-Christian Culture?
Chase Padusniak at Intercollegiate Review