As policymakers debate the future of data centers in the United States, the discussion often centers on the ambitions of the largest technology companies. That focus misses a basic point: small businesses depend on reliable, affordable access to cloud and AI tools. Data center policy is not only about large firms building more infrastructure. As our recent letter to the House Science Committee explained, small businesses are often the leading users, developers, and deployers of AI-enabled services, and their ability to innovate depends on robust digital infrastructure.

That dependence is becoming more important, not less. Federal and independent energy analyses now project sharp growth in electricity demand from data centers over the next several years. Data centers could account for roughly 9 to 17 percent of U.S. electricity use by 2030.

Those numbers have made data centers a visible policy issue. But they should not lead policymakers to treat data center growth itself as the problem. The better question is whether the United States will build the energy and computing infrastructure needed to support AI deployment across the economy while protecting households and communities from unfair burdens. That is a manageable policy challenge. It is not a reason to slow or freeze deployment.

Small businesses have the most to lose from getting this wrong. Before the cloud, a small software company needed to spend heavily on on-premises servers, IT staff, and maintenance just to get started. Cloud infrastructure changed that. It lowered entry costs and allowed smaller firms to access world-class computing power on demand. AI is creating a similar transition. Most smaller firms do not build advanced AI systems in-house. They access them through cloud services and AI tools that run on data center infrastructure. If that infrastructure becomes harder to build, slower to expand, or more expensive to power, the practical result is that AI becomes harder for smaller firms to access.

That is why pro-data center policy should be viewed as pro-small business policy. Startups and independent developers benefit when upstream infrastructure is scalable, reliable, and affordable. They are harmed when policy uncertainty delays construction, increases energy costs, or discourages investment in the systems they rely on. In that sense, restrictive data center policy does not mainly restrain large firms. It raises input costs for smaller ones.

The emerging debate over moratoria illustrates the risk. Some advocates have called for national or state-level pauses on new data center development. New York lawmakers have advanced legislation that would impose a moratorium on new permits while further reviews are conducted, and Maine lawmakers have passed a bill that would temporarily pause approval of new large-scale data centers above 20 megawatts through October 2027, pending gubernatorial review. These approaches may be framed as precautionary, but in practice they would operate as a brake on the broader innovation ecosystem. They would not pause demand for AI tools. They would simply make the infrastructure underlying those tools scarcer and more expensive.

As policymakers consider how to respond to rising data center demand, the key point is that delay and uncertainty carry real costs. Reviews and planning processes matter, but when infrastructure becomes too difficult or too slow to build, the effects do not stop with large technology companies. They flow downstream to the small businesses that depend on cloud computing, AI tools, and reliable digital services to compete. The central policy challenge is not whether data center growth should occur. It is whether the United States can support that growth in a way that is predictable, reliable, and fair, while ensuring that the costs of expansion are managed responsibly rather than pushed onto smaller firms and other customers.

A pro-data center position does not require pretending there are no tradeoffs. Data centers do create new demand on the grid. They do require land, water, and power. And communities are right to ask how those costs will be managed. But the answer should be to structure policy so that large new loads support new supply, fund needed delivery upgrades, and enter transparent rate arrangements that protect other customers. Done well, this approach can support both infrastructure growth and public confidence. Done poorly, it shifts the burden disproportionately onto everyone else, including the small businesses that are already operating on tighter margins.

If the United States wants to lead in AI deployment, it cannot treat the physical systems underlying that deployment as an afterthought. Other countries are moving quickly to expand capacity. A policy environment defined by delays, fragmentation, and moratoria would weaken the conditions that have historically allowed smaller American firms to build, scale, and compete.

Data centers are part of the enabling infrastructure that allows smaller firms to use AI, reach customers, lower costs, and compete. If policymakers want an AI economy that remains open to innovators of all sizes, they should treat pro-data center policy as part of a broader pro-innovation agenda. That means building more infrastructure, not less, while making sure the rules are clear, the costs are allocated fairly, and the benefits of that infrastructure remain broadly accessible.