I've lived in multiple countries, and dog culture has gotten completely out of hand. It's increasingly common to see dogs in shopping carts, in cafes, in churches, in museums, everywhere. It never ends.
The crazy thing is that many Westerners seem to love dogs more than people, and it really shows. People who take their dogs into grocery stores have lost all sense of boundaries, and you won't convince me otherwise. It feels like they need constant validation from their pets.