America's illusion of moral and economic superiority, of greatness, has been completely lost. Hell, even militarily, when was the last time we won a war? America is weak, and its government is easy to infiltrate. America is owned by the world, yet it is now cutting itself off from the world economically.