I guess my question is: who gave the Americans the right? I say this as an American. But wouldn't the world be a better place if we just minded our own business and quit nation-building and stoking nonexistent fires?
Absolute whitewashing of US crimes against humanity throughout the first half of the 20th century. Example: Big Stick ideology.
The US also constantly did things like this in the Americas throughout the 19th century; see the United Fruit Company or the Military Government of Cuba.