I guess my question is: who gave the Americans the right? I say this as an American. But wouldn't the world be a better place if we just minded our own business and quit nation-building and stoking nonexistent fires?
There are privileges to being an empire, and the capitalists in the US keep using that empire to secure them. Favorable trade, commercial, and financing terms are big ones.
Also, the US war industry pushes the country to intervene. You can see interventionist and isolationist movements in the US fighting right now over how directly the US gets involved in the Iran-Israel conflict.