Search Results
7/14/2025, 8:55:01 PM
Why do people still deny that Africa was better off under European rule? As soon as our labour to civilize that continent started bearing fruit, we called it quits and left them to their own devices; for many reasons not our own of course. But why is this fact still such a taboo? My coworkers, for example, will just not hear it.
Is it because they're propagandized through and through?