Afghanistan War a Defeat for U.S. Empire, A Victory for American Nation and People
Does the United States need to be an imperial warlord on the world stage?
Should it be?
Is the role of the U.S. to police the globe, acting as if it has complete license to levy its own political and moral judgments on other nations and to pursue military interventions in, indeed occupations of, those nations on the basis of those judgments?
The ongoing withdrawal from Afghanistan, which has inspired so much controversy, will hopefully at some point also inspire discussion of these questions about U.S. foreign policy and how the U.S. conceptualizes its role on the world stage.
Maybe there is another way for the U.S. to participate in global politics, one that might in fact foster the spread of political democracy and economic justice.
Let’s face it. The history of U.S. behavior around the globe would be better characterized as one of empire-building than nation-building. Far from acting with the objective of spreading democracy, U.S. foreign policy and patterns of intervention around the globe indicate that the U.S. has been more interested in pursuing the political and economic interests of the wealthiest Americans, often to the detriment of democracy and stability around the globe and often to the…