I think the Allies should have taken France during WW2
Maybe I'm a heartless person, but I feel the smart move for the Allies in WW2 would have been to take control of France and keep it. After Hitler conquered France and we then defeated Hitler, why did we just give France back? I think the Americans should have taken France and Italy and allowed the Soviets to keep the eastern countries. Why not? It would have benefited America. America could have swallowed half of Europe this way. We would have had complete control of the world and would never have had to worry about losing another war.