ERCOT is shifting to batch studies for large-load interconnection so it can process a surge of requests from data centers, crypto mines, hydrogen plants, and industrial projects without constant restudies.
“Batch Zero” will prioritize projects already deep in the queue, and will effectively decide which hyperscale data centers in Texas get firm megawatts first and which are delayed.
Big AI buyers like Google, Meta, Amazon, Microsoft, and OpenAI support the batch approach because the current one-by-one review leaves them in multi-year limbo on when and how much power they will actually get.
The new process will define staged “on-ramps” (for example, 100 MW now, full 500 MW after upgrades) and force developers to make financial commitments on a clear timeline, which matters for billion-dollar GPU and facility buildouts.
ERCOT’s transmission planning system was built for 40–50 large projects but is now handling 225 new requests in a single year, so the bottleneck isn’t GPUs; it’s grid capacity and upgrade timing.
For AI infrastructure, this means Texas build plans will hinge less on land and chips and more on where a project falls in ERCOT’s batches, how much firm capacity it secures early, and when transmission upgrades actually deliver.
The piece is worth a read if you care about how grid policy will gate Texas’s next wave of AI and data center deployments.
Source: ERCOT will soon have new way to consider data centers