The following is the Nov. 14, 2022, Congressional Research Service report, Defense Primer: U.S. Policy on Lethal Autonomous Weapon Systems.
From the report:
Lethal autonomous weapon systems (LAWS) are a special class of weapon systems that use sensor suites and computer algorithms to independently identify a target and employ an onboard weapon system to engage and destroy the target without manual human control of the system. Although these systems are not yet in widespread development, it is believed they would enable military operations in communications-degraded or -denied environments in which traditional systems may not be able to operate.
Contrary to a number of news reports, U.S. policy does not prohibit the development or employment of LAWS. Although the United States does not currently have LAWS in its inventory, some senior military and defense leaders have stated that the United States may be compelled to develop LAWS in the future if U.S. competitors choose to do so. At the same time, a growing number of states and nongovernmental organizations are appealing to the international community for regulation of or a ban on LAWS due to ethical concerns.
Developments in both autonomous weapons technology and international discussions of LAWS could hold implications for congressional oversight, defense investments, military concepts of operations, treaty-making, and the future of war.
Then-Deputy Secretary of Defense Ashton Carter issued DOD’s policy on autonomy in weapons systems, Department of Defense Directive (DODD) 3000.09 (the directive), in November 2012. U.S. defense officials have stated that they plan to release an updated directive by the end of 2022.
Definitions. There is no agreed definition of lethal autonomous weapon systems that is used in international fora. However, DODD 3000.09 provides definitions for different categories of autonomous weapon systems for the purposes of the U.S. military. These definitions are principally grounded in the role of the human operator with regard to target selection and engagement decisions, rather than in the technological sophistication of the weapon system.
DODD 3000.09 defines LAWS as “weapon system[s] that, once activated, can select and engage targets without further intervention by a human operator.” This concept of autonomy is also known as “human out of the loop” or “full autonomy.” The directive contrasts LAWS with human-supervised, or “human on the loop,” autonomous weapon systems, in which operators have the ability to monitor and halt a weapon’s target engagement. Another category is semi-autonomous, or “human in the loop,” weapon systems that “only engage individual targets or specific target groups that have been selected by a human operator.” Semi-autonomous weapons include so-called “fire and forget” weapons, such as certain types of guided missiles, that deliver effects to human-identified targets using autonomous functions.
The directive does not cover “autonomous or semi-autonomous cyberspace systems for cyberspace operations; unarmed, unmanned platforms; unguided munitions; munitions manually guided by the operator (e.g., laser- or wire-guided munitions); mines; [and] unexploded explosive ordnance,” nor subject them to its guidelines.