Christians have been hit with disturbing news over the past few years. We have realized that our culture is “post-Christian.” “In general Western culture is deeply post-Christian,” says Dr. William Lane Craig. Any observer would have to agree that, in its expressions of music, movies, television, or the written word, Western culture does not even …