I'm interested in how the U.S. manages to be so degenerate and yet so puritanical at the same time.

Any book recommendations to get to the bottom of this, or on American morality and ethics in general?