Paul Hayden Miller – The Hill April 16, 2026
The Department of Defense does not always announce structural shifts loudly; often, it buries them in the dense columns of budget lines where only the most attentive analysts can find the seismic activity. The $1.5 trillion spending proposal for fiscal 2027 contains precisely such a shift — a profound and subtle transformation that effectively reorders the American approach to conflict.
Central to this plan is the Departmental Autonomous Warfighting Group — an organization established late last year with a modest budget of $225 million. For the 2027 fiscal year, the Pentagon has requested $54.6 billion for this organization, representing a staggering 24,000 percent increase in funding. That single line accounts for nearly 15 percent of the total reconciliation package. It exceeds the gross domestic product of many small nations and is higher than the entire budget request for the Marine Corps ($52.8 billion).
Internal documents indicate the intent to transform the group into a unified combatant command, a joint entity that would coordinate drone, aircraft, and vessel operations across all warfighting domains. This shift mirrors previous military evolutions, specifically the elevation of Cyber Command in 2018 and the establishment of Space Command in 2019.
Historically, Congress has authorized these specialized commands when fragmented service approaches created redundancy or dangerous gaps. The same logic applies here: By consolidating these capabilities, Secretary of War Pete Hegseth wants to streamline the development of autonomous systems, ensuring the service branches do not pursue conflicting tactical goals or incompatible technical standards.
The request reflects hard lessons learned in modern conflicts, particularly the ongoing struggles in Ukraine and Iran. Chief Technology Officer Emil Michael has observed that these wars routinely involve thousands of low-cost systems engaging against each other in highly contested environments.
To maintain a competitive edge, the Pentagon launched the Replicator program with the ambitious goal of deploying hundreds of thousands of one-way attack drones by 2028. However, early efforts faced substantial hurdles regarding hardware reliability and supply chain bottlenecks that delayed delivery targets. These shortcomings led to a fundamental realization within the leadership: Hardware is secondary to the AI software that drives it.
The current strategy treats artificial intelligence and physical autonomy as a tandem force, with the software as the primary strategic asset. This perspective has created a unique friction point between the War Department and the private sector — specifically with the AI company Anthropic. Whereas the military requires flexible, decisive models for high-stakes environments, Anthropic has maintained strict red lines regarding the use of its Claude model.
This impasse prompted the Department of War to designate certain domestic AI firms as supply chain risks — a move that highlights the growing chasm between Silicon Valley and the national security community. If a model is too restricted to perform in a combat environment, it becomes a liability rather than an asset.
The policy landscape remains contentious as Congress prepares the next National Defense Authorization Act. While the technological advantages are evident, the legislative challenges are substantial. Armed Services Committee leaders like Senator Roger Wicker and Representative Mike Rogers have cautioned against making such massive structural shifts without a clear strategy that accounts for ethical and operational oversight. They have drawn clear lines on executive branch activism regarding autonomy, requiring that any major push receive rigorous scrutiny.
Rep. Rob Wittman (R-Va.) has echoed these concerns, noting that while the military must move fast, it cannot afford to abandon the principles of accountability that define American governance.
Internationally, the pressure is even more pronounced. Recently, 156 nations supported a United Nations General Assembly resolution expressing deep concern over the risks of an autonomous arms race. These nations fear that removing humans from the loop will lower the threshold for conflict and lead to unpredictable escalations.
The U.S. was among the minority that declined to support the resolution, citing the necessity of maintaining a technological lead over competitors such as China and Russia, which are pursuing their own autonomous capabilities with little regard for international norms. Current U.S. policy prohibits the employment of lethal autonomous systems without senior official approval, but critics argue this is a temporary safeguard that could easily be swept away by the speed of machine warfare.
History suggests that as technical capabilities advance, legal frameworks must evolve to provide clear definitions of what constitutes an autonomous weapon. The transition to a unified command for autonomy is not merely a budgetary or structural change; it is a recognition that the nature of power has shifted from physical platforms to the cognitive software that controls them. Failure to adapt to this reality would leave the U.S. holding an expensive manned fleet in an age of attritable, intelligent swarms. The window for this transformation is closing — the fiscal 2027 budget request is the most significant signal yet that the Pentagon is ready to step through it.
Success will depend on more than just the $54.6 billion requested; it will require a new type of coordination between the warriors who fight and the engineers who build the tools. As the Department of War navigates the friction with firms like Anthropic and the skepticism on Capitol Hill, it must articulate a vision where autonomy enhances human judgment rather than replaces it.
If it succeeds, the unified command will become the backbone of American security for the next century; if it fails, then the machines will indeed be at the helm, but we may not like where they are steering us.
Paul Hayden Miller is a defense policy advisor and defense technology investor.