Is it normal to think that the American empire should expand?
I think it would be better for many parts of the world, like South America, Mexico, Canada, Africa, Asia, and maybe Europe, to be controlled by the Empire of the United States of America.
I only want to help weak people, and since I know there are many problems outside America, I think we should teach them how to do everything.
We have the number one military in the world, stronger than anything the world has ever seen. If any country threatened us, we would raise our fists and they would cower, because everyone knows how strong we are.
I think the American Empire is already big, because we have military bases in nearly every country in the world; it's just that officially we aren't the rulers of those countries yet.
Is it normal to think this, or do you think we should just keep to ourselves, since most people outside the USA aren't smart or civilized enough and it would be pointless anyway?