The USA did not win World War Two. The vast majority of casualties occurred on the Eastern Front, between the Red Army of the Soviet Union and the German Wehrmacht. It was Hollywood that claimed the war was won by the USA rather than, as was the case, by the Russians. Lend-Lease, granted to the USSR on far more generous terms than the version which bankrupted Britain, was of course a huge help.
The USA has not won a war against a near-peer enemy since the 19th century. The Spanish-American War of 1898 ended Spanish imperial claims in the Americas and the Philippines.