This piece was co-written by Sidita Kushi.
Over time, the United States has become comfortable using greater levels of force abroad. This was not the case at the country’s inception: in its first eras of statehood, the United States engaged minimally outside North America, as most of its conflicts involved defending its borders, fighting frontier wars, and expanding westward. The United States’ involvement in World War I and World War II ushered Washington into global leadership and much greater global engagement. After the Cold War, and especially following the 9/11 attacks, the percentage of U.S. armed disputes that were initiated by U.S. adversaries dropped precipitously. The United States now finds itself in an era in which its adversaries provoke it militarily less frequently—and yet Washington is intervening with armed force more than ever.
This is an unfortunate trend. For evidence, look no further than the disastrous U.S. military interventions in Afghanistan, Iraq, and Libya. The overly frequent resort to force also undermines U.S. legitimacy in the world. As the U.S. diplomatic corps and American influence abroad shrink, the country’s military footprint only grows. Global opinion polls show that more than half of the world’s population now views the United States as a threat. There could be a change in the offing, however: as China becomes a more potent power, the United States will be more likely to refrain from foreign interventions, because such interventions could end in a showdown with another superpower. And that restraint could ultimately lead U.S. policymakers to pursue diplomatic and economic initiatives that bolster the United States’ soft power and global credibility.
RULES OF WARFARE
To put the use of U.S. force into context, it is helpful to consider the conditions that would traditionally legitimate it. In contemporary international law, whose foundations date from ancient times, a legitimate resort to violence must satisfy three fundamental conditions. First, force can be used only in self-defense or in defense of an innocent bystander. Second, it must, whenever possible, represent a response in kind: if someone throws a rock at another person, it would be acceptable for the victim to throw a rock back but not to use a firearm (even though both rocks and firearms can inflict lethal injuries). Third, the violence must be proportional to that attempted or completed, wielded only to the degree needed to reestablish the peace. By this standard, if a member of one group is injured by members of another, it would not be legitimate for the victimized group to kill one of the aggressors. These principles apply to interstate violence as well as to interpersonal violence. But a Latin aphorism captures a tragic misconception that shapes conflicts among states: Silent enim leges inter arma, “in wartime, the law is silent.” This is commonly understood to mean that when survival is at stake, anything goes.
It is arguably legitimate to believe that when a state’s survival is at stake, anything does go. But of course, not all conflicts are existential: survival is rarely at stake, and it certainly has not been at stake in the conflicts that Washington has started in recent decades. Although the cumulative impact of this U.S. propensity to resort to force may be invisible to U.S. citizens and their representatives, it is clear to U.S. adversaries and even allies abroad. Pew Research Center polling conducted between 2013 and 2018 found that U.S. prestige has declined precipitously: in 2013, 25 percent of foreigners considered U.S. power and influence to be a major threat, a figure that rose to 45 percent five years later.
Read the full piece in Foreign Affairs.