If there had been no WWI and no Treaty of Versailles, Hitler would never have come to power. Britain and the rest of Europe share the blame for the rise of the Nazis.
If there had been no WWI, Europe might still be under the jackboots of the old empires: the Ottomans, Austria-Hungary, Tsarist Russia. For most of their citizens, those weren't much better than Hitler's Germany. Change happens, more often than not through war, and once it has happened you can't stuff the genie back into the bottle and fantasize about the "what if".