No, the US has been an empire from the start. Unless you don’t count conquering and colonizing the indigenous peoples because they aren’t “civilized” or something.
This is most likely what’s really happening. It’s the same way Democrats try to boost right-wing extremists in Republican primaries because they see them as easier to beat.