What will be left after the West has gone?
For the last 70 years, one thing has stood between tyrants and their ability to have their way in this world: American hegemony. Despised from Europe to the Middle East, it nevertheless managed to rid the world of the Third Reich, the USSR, and yes, even Saddam Hussein. So how is it that the nation that afforded the West so much of the freedom it has enjoyed is so little appreciated around the globe? The only way I can explain it is to say that I do not believe the concept of "the West" even exists much in Europe anymore, let alone in any other part of the world. I lived in Finland for a few years, and people there hardly knew what I was talking about when I referred to the West. This may be a result of a historical disconnect from the concept, but I think it is more attributable to a conceptual disconnect from history. Wikipedia describes the West as: Western European or Western European-derived nations which enjoy relatively strong economies and stable governments, allow freedom...