Will Christianity outlive the West?
Now that the West's time as a civilization seems to be drawing to a close, what will be the fate of Christianity? Will it fall with the West, or outgrow it? Within fifty years the majority of Christians will live in Africa, as the faith's center of gravity shifts away from the West. So will Christianity move on and begin another chapter of its history, leaving Western civilization as merely one of its many homes, as Rome, the Spanish Empire, Britain, and America each once were? Or is its fate sealed with that of the West?