The Gulf’s data center boom is really a test of whether digital ambition can survive the realities of energy, water, and infrastructure.
The Gulf’s data center surge is often presented as a story of digital ambition, cloud capacity, and artificial intelligence readiness, but that framing is incomplete. Beneath every announcement of new server halls, hyperscale campuses, and sovereign cloud regions lies a more consequential question: what kind of physical system will sustain this new digital age? In the GCC, data centers do not arrive in a vacuum. They arrive in economies already shaped by extreme heat, rising cooling demand, water scarcity, desalination dependence, and ambitious efforts to modernize national energy systems while diversifying away from older economic models. This Part I was written to place the data center boom in its proper context. It argues that the Gulf’s new digital infrastructure should be understood not merely as a technology investment, but as a test of energy policy, industrial design, and long-term resilience. The real issue is not whether the region can attract global cloud and AI players. It clearly can. The deeper issue is whether the GCC can power, cool, regulate, and benchmark this expansion intelligently enough to turn it into a strategic asset rather than an avoidable burden. That is why we recommend moving beyond headlines and into the harder terrain of megawatts, electricity demand, comparative ratios, and system consequences.
This is Part I of a two-part series examining the physical and institutional realities of the GCC’s data center boom, from electricity demand and cooling constraints to infrastructure design and governance. Part II will turn to the deeper economic question of whether this buildout produces a genuine knowledge economy or simply anchors global digital infrastructure without capturing its highest-value layers locally.
Reading Time: 40 min.
All illustrations are copyrighted and may not be used, reproduced, or distributed without prior written permission.
Summary: This Part I examines the rapid expansion of data centers across the GCC and argues that the region’s digital buildout must be analyzed through the lens of energy, cooling, water, and infrastructure planning, not technology promotion alone. Saudi Arabia, the UAE, Qatar, Bahrain, and Kuwait are all advancing cloud and AI-related facilities, but the significance of these projects depends less on headline investment values than on their implied electrical load, cooling requirements, and interaction with already stressed regional power systems. We show that the Gulf’s electricity context makes this issue especially important. GCC countries already face intense summer peak demand, large cooling loads, and growing desalination-linked electricity needs. In that setting, data centers become more than commercial real estate; they become a new class of strategic load. Therefore, we introduce a practical benchmarking framework to translate announced investment into estimated IT load, total facility load, annual electricity use, and ratios to national consumption, peak demand, household equivalents, and desalination-related energy use. Its central conclusion is that the GCC has a genuine opportunity to become a major global compute hub, but success will depend on whether the region treats data centers as part of an integrated system of power, water, cooling, and industrial policy.
The overall arc is: headline data center investment and visible digital buildout moving into the hidden physical realities of electricity demand, cooling intensity, and water-linked stress → quantitative benchmarking as the deeper layer that reveals how symbolic technology announcements translate into real system burdens → peak demand, desalination, and fossil-heavy generation as structural pressures that complicate the promise of frictionless digital growth → strategic competition, commercial ambition, and diversification policy as forces that push expansion faster than infrastructure logic may comfortably allow → legitimacy as distinct from mere investment attractiveness or formal approval, opening the question of whether the public and policymakers will accept data centers that deepen stress on power and water systems → domestic governance as a struggle over whether utilities, ministries, developers, or national planners should define the terms of growth → regional standards, grid conditions, cleaner power sourcing, and efficiency requirements as the practical guardrails that could turn expansion into disciplined infrastructure rather than prestige-led overload → broader implications for how Gulf digital governance must move beyond celebrating capacity and instead toward structured oversight of load, resilience, cooling, water, and state authority.
Note: The calculations presented should be read as a structured illustrative application of our analytical framework, anchored where possible in public data and supplemented where necessary by clearly stated planning assumptions.
What appears to be a technology buildout is increasingly becoming a strategic expression of national power, competitiveness, and long-term state direction.
The most important shift in the Gulf is that data centers are no longer being treated as back-office digital assets; they are being elevated into the same strategic category as ports, airports, industrial zones, and energy corridors. In Saudi Arabia, the Ministry of Communications and Information Technology said LEAP25 — the 2025 edition of LEAP, a major technology conference held in Riyadh, Saudi Arabia — brought more than $14.9 billion in AI-related investments and projects on the first day alone, including a KKR and Gulf Data Hub plan for up to 300 megawatts of data center capacity. Separately, NEOM and DataVolt announced a $5 billion first phase for what they describe as a net-zero AI factory campus in Oxagon, expected to be operational by 2028. Microsoft has also confirmed that customers will be able to run workloads from its Saudi Arabia East data center region from Q4 2026, with three availability zones in the Kingdom’s Eastern Province. These are not symbolic announcements. They indicate that compute capacity itself is becoming a pillar of national competitiveness in Saudi policy.
The UAE is moving along a parallel track, but with its own model of infrastructure layering. du announced a new hyperscale data center in the UAE at a cost of around AED 2 billion, with Microsoft as the main tenant. AWS, for its part, said its UAE region would support an average of 5,984 full-time jobs annually through investments of AED 20.1 billion over 2022–2036, and that the region launched with three availability zones. Bahrain already has an operating AWS region that the company says has underpinned the kingdom’s cloud-first strategy, while AWS highlighted Bahrain’s high cloud adoption in a 2025 public-sector case study. In other words, parts of the GCC are already in the operational phase of cloud regionalization, while others are entering a much larger AI-capacity expansion phase. That distinction matters because operating cloud regions are one thing; AI-heavy compute clusters with much higher densities and more demanding cooling loads are another.
Qatar and Kuwait show that the regional story is not confined to Saudi Arabia and the UAE. Microsoft opened its first global data center region in Qatar in 2022, Google Cloud opened its Doha region in 2023 with three zones, and Microsoft announced in March 2025 that it intends to establish an AI-powered Azure Region in Kuwait. Google Cloud also said in April 2025 that rapid expansion was underway in Kuwait as part of its broader infrastructure growth. Taken together, these moves show a GCC pattern: Bahrain and Qatar helped establish the region’s earlier wave of sovereign cloud presence, while Saudi Arabia, the UAE, and Kuwait are now pushing toward a newer phase defined by AI demand, higher-density compute, and larger electricity consequences. That is exactly why this topic deserves a serious analysis. The Gulf is not merely buying digital prestige. It is building the physical foundations of an AI economy, and every foundation has an energy cost.
What is revealed about the GCC’s shift from digital ambition to hard strategic infrastructure planning is: (i) GCC governments are treating data centers as strategic infrastructure rather than ordinary commercial real estate; (ii) Saudi Arabia is emerging as the clear scale leader in announced AI-linked data center capacity and investment; (iii) the UAE is pursuing a layered model that combines telecom operators, hyperscalers, and sovereign cloud-region development; (iv) Qatar and Bahrain demonstrate that the Gulf already has real operating experience in cloud localization and sovereign hosting; and (v) Kuwait’s recent moves confirm that the next wave of data center development is spreading across the region rather than remaining concentrated in only one or two flagship states.
In the Gulf, electricity and water do not operate as separate systems; each becomes more dependent as the other grows.
A data center boom would be a major energy topic anywhere, but in the GCC it lands on top of an electricity system already shaped by extreme summer cooling demand and water scarcity. The IEA says electricity demand in the MENA region tripled between 2000 and 2024, increasing by more than 1,000 TWh, and is projected to rise by another 50% by 2035. More importantly for the Gulf, the IEA says cooling now makes up nearly half of peak electricity demand in the region and one-quarter of annual electricity demand. That means the starting point is not an empty grid waiting for new digital loads. The starting point is an electricity system already wrestling with air-conditioning-intensive urban life in very hot climates. In such a system, adding AI-scale data centers is not just adding more demand; it is adding demand to a grid where the most difficult hours are already difficult.
The water dimension makes the issue sharper. The IEA says MENA produced 12 billion cubic metres of desalinated water in 2024, equivalent to the annual flow of the Euphrates River, and projects production to triple by 2035. It also says that cooling and desalination together are on course to account for close to 40% of projected electricity-demand growth in the region through 2035, while new digital infrastructure, including data centers, is also expected to add to rising demand. In other words, Gulf electricity systems are being asked to do several hard things at once: cool cities, desalinate water, electrify more activities, and now host AI-scale computation. That is why an approach about only “digital investment” would miss the true issue. The real policy question is whether compute demand arrives in a way that competes with cooling and desalination at the worst moments, or whether it can be designed to coexist with them.
The supply mix matters just as much as the demand story. The IEA says natural gas and oil still accounted for over 90% of MENA electricity generation in 2024, with gas at 70% and oil at 20%. That means much of the Gulf’s current incremental electricity, unless specifically tied to cleaner supply, still risks being fossil-heavy. This is why the same megawatt of new data center load can tell two very different stories. In one version, the GCC becomes the ideal host for next-generation digital infrastructure because it pairs reliable power, modern grids, gas flexibility, and fast-growing renewables. In the other, it locks AI expansion into a higher-emissions electricity profile while worsening summer peak pressure. That fork in the road is what gives this data center boom its urgency. The Gulf is not choosing whether data centers will matter. It is choosing what kind of electricity economy those data centers will create.
What OHK makes unmistakably clear about why energy stress changes the meaning of data center growth in the Gulf: (i) the GCC data center debate cannot be separated from air-conditioning intensity and rising summer peak electricity demand; (ii) desalination is not a peripheral concern but a central part of the region’s future electricity-growth story; (iii) new digital loads matter more in the Gulf because power systems are already structurally strained by heat and water needs; (iv) the carbon implications of data center expansion depend heavily on the electricity mix feeding those facilities; and (v) any serious GCC data center strategy must be framed as energy policy and infrastructure policy, not merely as digital-sector promotion.
Headlines may begin with investment, but their real significance emerges only when capital is translated into electricity demand, system load, and consequence.
One of the easiest mistakes in public discussion is to quote investment values without translating them into electrical consequences. Dollars make headlines, but megawatts reveal the physical burden. The KKR and Gulf Data Hub plan in Saudi Arabia points to up to 300 MW of capacity. If a 300 MW facility were utilized continuously across the year, it would draw roughly 2.6 TWh annually. The initial $5 billion DataVolt phase at NEOM is even more revealing because it is part of a roadmap toward a 1.5 GW AI campus; at full continuous utilization, 1.5 GW would correspond to about 13.1 TWh per year. Those are not trivial numbers. For context, DOE says all U.S. data centers together consumed 176 TWh in 2023. So, illustratively, one 1.5 GW campus running at a full-year equivalent load would amount to roughly 7.5% of all U.S. data center electricity use in 2023. Even the 300 MW project, on that same illustrative basis, would equal about 1.5% of U.S. data center electricity use in 2023.
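To make the translation explicit, the continuous-load arithmetic in this paragraph can be reproduced in a few lines of Python. This is an illustrative sketch only; the 176 TWh DOE figure and the 300 MW and 1.5 GW capacities are the numbers cited above.

```python
# Convert announced nameplate capacity into an illustrative full-year energy draw.
HOURS_PER_YEAR = 8760

def continuous_annual_twh(capacity_mw: float) -> float:
    """Energy (TWh/year) if the stated capacity ran flat-out all year."""
    return capacity_mw * HOURS_PER_YEAR / 1_000_000

US_DATA_CENTER_TWH_2023 = 176  # DOE figure cited in the text

for mw in (300, 1500):
    twh = continuous_annual_twh(mw)
    share = 100 * twh / US_DATA_CENTER_TWH_2023
    print(f"{mw} MW -> {twh:.1f} TWh/yr, {share:.1f}% of 2023 U.S. data center use")
```

Running this reproduces the roughly 2.6 TWh and 13.1 TWh figures, and their 1.5% and 7.5% shares of U.S. data center consumption, used in the paragraph above.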
Of course, this comparison needs care. Announced capacity is not the same as actual delivered energy, and nameplate scale is not the same as full utilization. Data centers ramp in stages, tenants fill capacity over time, workloads fluctuate, and hyperscalers rarely run every part of a campus flat-out from day one. That caveat is important because it stops us from becoming alarmist. But it does not weaken the basic point. It strengthens it. The right conclusion is not that every announcement equals immediate full load; it is that the upper range of infrastructure being contemplated in the GCC is already large enough to matter materially for electricity planning. When policymakers, utilities, and investors hear “data center,” they should stop imagining a warehouse with backup generators and start imagining a new industrial-scale customer class with load characteristics that can rival major factories or clusters of them.
This translation from finance to electricity also changes how the public should compare projects. A $2 billion hyperscale announcement in the UAE is financially impressive, but the harder question is what the eventual IT load, cooling load, and power profile will be. A cloud region with three availability zones is strategically important for sovereignty and latency, but its energy significance depends on the workloads it attracts. A sovereign AI region in Kuwait may begin as a policy milestone, yet its eventual energy relevance will be defined by how much high-performance compute it hosts. One must repeatedly ask: How many megawatts? What utilization? What time of day? What cooling design? What electricity source? Once those questions are used, the subject becomes much richer. It stops being a story about corporate logos and starts becoming a story about grid planning, load shape, and industrial design.
What OHK demonstrates about why megawatts and annual load matter more than investment headlines alone: (i) financial announcements become far more meaningful when translated into power demand rather than left as abstract capital totals; (ii) even a few hundred megawatts of continuous digital load is already system-relevant and not a marginal addition; (iii) gigawatt-scale AI campuses should be planned in the same category as other major industrial developments; (iv) announced nameplate capacity should be treated as a serious planning envelope even if utilization rises gradually over time; and (v) public understanding improves dramatically once money figures are converted into electricity demand and operational load.
Power only becomes meaningful when it is measured, compared, and translated into forms that allow disciplined judgment and credible planning.
One of the most useful additions to this topic, and one missing from the plans underway, is a simple empirical framework for moving from headline investment numbers to estimated physical consequences. That matters because the public discussion on data centers is still too often trapped in financial language. We hear about billions of dollars committed, strategic partnerships announced, or regional hubs being launched, but those figures do not automatically tell us what a project means for a grid, for annual electricity demand, or for comparisons with other economic uses of power. We may understand that a project is large, yet still have no way to judge whether it is moderately significant, systemically important, or transformational from an energy point of view. That is exactly where a quantitative approach becomes valuable. It gives discipline. It also lets us move from impressionistic writing to repeatable estimation, which makes the argument more credible and more reusable.
The right approach is not to pretend that one formula can perfectly describe every data center. It cannot. Actual electricity demand depends on the type of workload, the design of the facility, redundancy level, cooling architecture, ramp-up schedule, occupancy, and operating intensity. But that does not mean estimation is impossible. It means we should present these equations as planning equations or benchmarking equations, not as exact engineering guarantees. In fact, that distinction strengthens credibility. We are not claiming precision where precision is impossible. We are showing how to derive reasonable first-pass estimates from publicly available information.
1) The first and most useful equation is the one that converts capital investment into estimated IT load. Let:
I = total project investment in billions of dollars
c = capital intensity in millions of dollars per megawatt of IT load
Then the implied IT load can be approximated as: Estimated IT Load (MW) = 1000 × I / c
This is powerful because it immediately turns a vague investment number into something physical. A $2 billion project with a capital intensity of $10 million per MW of IT load implies about 200 MW of IT capacity. A $5 billion project at $12 million per MW of IT load implies roughly 417 MW of IT capacity. That does not tell us everything, but it gives a first real anchor. It helps GCC countries go beyond speaking in financial symbolism and start speaking in electrical substance.
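A minimal sketch of this first equation, using the article's own illustrative inputs:

```python
def estimated_it_load_mw(investment_bn: float, capex_per_mw_mn: float) -> float:
    """Equation 1: implied IT load (MW) from investment ($bn) and
    capital intensity ($mn per MW of IT load)."""
    return 1000 * investment_bn / capex_per_mw_mn

# $2bn at $10mn/MW and $5bn at $12mn/MW, as in the text
print(estimated_it_load_mw(2, 10))   # -> 200.0 MW
print(estimated_it_load_mw(5, 12))   # -> ~416.7 MW
```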
2) The second step is to convert IT load into total facility load by using Power Usage Effectiveness, or PUE. Let:
PUE = total facility power / IT power
Then: Total Facility Load (MW) = IT Load × PUE
This matters because the servers are not the whole story. A data center also uses electricity for cooling, fans, pumps, power conditioning, lighting, and support systems. So if a facility has 200 MW of IT load and a PUE of 1.30, its approximate total operating load becomes: 200 × 1.30 = 260 MW
That is the number the grid ultimately cares about. This is one of the most important conceptual shifts: we should stop thinking only about servers and start thinking about the entire energy ecosystem surrounding those servers.
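The same worked example in code form, splitting out the non-IT overhead that the PUE multiplier represents (illustrative only):

```python
def total_facility_load_mw(it_load_mw: float, pue: float) -> float:
    """Equation 2: total grid-facing load = IT load x PUE."""
    return it_load_mw * pue

it_mw = 200
total_mw = total_facility_load_mw(it_mw, 1.30)
overhead_mw = total_mw - it_mw  # cooling, fans, pumps, power conditioning, lighting
print(total_mw, overhead_mw)   # -> 260.0 60.0
```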
3) The third equation is the most useful one for annual comparison:
Let:
U = utilization factor or effective average operating intensity, expressed as a decimal
(For example, 0.85 means the facility averages 85% of its effective design operation across the year.)
Then: Annual Electricity Use (TWh/year) = IT Load × PUE × U × 8,760 / 1,000,000
If we substitute the first equation into this one, we get a highly practical derived form:
Annual Electricity Use (TWh/year) = 8.76 × I × PUE × U / c
That equation is perhaps the most useful because it directly links investment to annual electricity consumption. It tells the reader, in one line, how a billion-dollar announcement can be converted into a benchmark electricity estimate. For example, if:
I = 2
PUE = 1.30
U = 0.85
c = 10
then: Annual Electricity ≈ 8.76 × 2 × 1.30 × 0.85 / 10 = 1.94 TWh/year
That is a very readable number. It immediately allows comparisons with other electricity uses, other industrial facilities, or segments of a national grid.
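The derived form, with the worked example above, can be checked directly:

```python
def annual_twh(investment_bn: float, pue: float, utilization: float,
               capex_per_mw_mn: float) -> float:
    """Derived form: TWh/year = 8.76 x I x PUE x U / c."""
    return 8.76 * investment_bn * pue * utilization / capex_per_mw_mn

# Worked example from the text: I = 2, PUE = 1.30, U = 0.85, c = 10
print(round(annual_twh(2, 1.30, 0.85, 10), 2))  # -> 1.94
```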
4) From there, we can introduce benchmark ratios. These are especially important in a GCC context because we need not only a raw TWh estimate, but also a way to compare it to broader system uses.
The first is the national electricity share ratio:
Data Center Share of National Electricity (%) = (Data Center Annual Electricity / National Annual Electricity) × 100
The second is the peak load share ratio:
Let λ = coincidence factor with system peak
Then: Peak Share of Grid (%) = (Total Facility MW × λ / System Peak MW) × 100
This equation matters because annual electricity use and peak stress are not the same thing. A project may look manageable as a share of annual national generation while still becoming highly relevant during peak summer hours.
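A sketch of the peak-share ratio. The 260 MW facility, 0.9 coincidence factor, and 30 GW system peak used here are hypothetical illustration values, not figures from the text:

```python
def peak_share_pct(facility_mw: float, coincidence: float,
                   system_peak_mw: float) -> float:
    """Share of system peak attributable to the facility, given coincidence factor lambda."""
    return 100 * facility_mw * coincidence / system_peak_mw

# Hypothetical: a 260 MW facility, lambda = 0.9, on a 30,000 MW system peak
print(round(peak_share_pct(260, 0.9, 30_000), 2))  # -> 0.78 (% of system peak)
```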
The third is a cooling burden ratio:
Let α = share of total electricity consumed by cooling systems
Then: Cooling Electricity (TWh/year) = Annual Data Center Electricity × α
This is extremely useful in the Gulf because it highlights the fact that a meaningful share of the facility’s power may be spent simply managing heat. In other words, some of the electricity is not being used to compute; it is being used to make computing possible in harsh thermal conditions.
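Illustratively, applying the cooling ratio to the 1.94 TWh example from earlier (the 25% to 30% cooling shares are planning assumptions, not measured values):

```python
def cooling_twh(annual_use_twh: float, cooling_share: float) -> float:
    """Cooling burden: portion of annual electricity consumed by thermal management."""
    return annual_use_twh * cooling_share

# 1.94 TWh/yr facility with an assumed 25% to 30% of power going to cooling
for alpha in (0.25, 0.30):
    print(f"alpha = {alpha}: {cooling_twh(1.94, alpha):.2f} TWh/yr on cooling")
```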
The fourth comparison is the household-equivalent ratio:
Let H = average annual household electricity consumption in MWh/year
Then: Household Equivalent = Data Center Annual Electricity (MWh/year) / H
This gives an intuitive social comparison, though it should be used carefully. It is rhetorically powerful, but it should not be used to imply that households and data centers are interchangeable uses of power. The point is scale illustration, not moral simplification.
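A sketch under an assumed benchmark; the 35 MWh/household value here is a hypothetical midpoint for illustration, not an official statistic:

```python
def household_equivalent(annual_mwh: float, household_mwh: float) -> float:
    """How many average households the same annual electricity would supply."""
    return annual_mwh / household_mwh

# Illustrative: 1,940,000 MWh/yr (the 1.94 TWh example) against 35 MWh/household
print(round(household_equivalent(1_940_000, 35)))  # -> 55429 households
```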
The fifth comparison is especially relevant for the Gulf: the desalination-equivalent ratio.
Let d = electricity intensity of desalination in kWh per cubic meter
Then: Equivalent Desalinated Water Volume (m³/year) = Data Center Annual Electricity (kWh/year) / d
This is a very strong GCC-specific benchmark because it links digital infrastructure to one of the region’s defining energy-water realities. It does not mean data centers “replace” desalination; rather, it shows what else the same electricity could theoretically support.
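Applying the ratio to the earlier 1.94 TWh example, with 4 kWh/m³ taken as an assumed planning value:

```python
def desal_equivalent_m3(annual_kwh: float, kwh_per_m3: float) -> float:
    """Desalinated-water volume the same electricity could theoretically produce."""
    return annual_kwh / kwh_per_m3

# Illustrative: 1.94 TWh/yr at an assumed 4 kWh per cubic metre
print(desal_equivalent_m3(1.94e9, 4) / 1e6)  # -> 485.0 (million m3/year)
```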
The real value of the equations and benchmarks is that they give a common measurement language. Once these equations are understood, readers can evaluate future Gulf announcements more intelligently. They can ask not just How much money was announced? but also What implied load does that represent?, What annual electricity might it require?, and How large is that relative to peak demand, cooling burdens, or desalination-linked electricity needs? That shift in questioning is exactly what elevates data center news from commentary to structured analysis.
What a practical quantitative method to benchmark data center investment against physical system impacts provides: (i) a benchmarking framework gives an analytical discipline rather than leaving it as narrative commentary alone; (ii) investment figures become much more useful once converted into estimated IT load, total facility load, and annual electricity consumption; (iii) derived equations allow readers to compare data centers with national electricity demand, peak load, cooling burden, household-equivalent consumption, and desalination-linked energy use; (iv) the most useful formulas are not exact engineering guarantees but disciplined first-pass planning tools; and (v) this turns conceptual argument into a reusable method for evaluating future announcements.
The Abu Dhabi 5 GW AI campus project should be treated as major electricity infrastructure and not only as a digital-investment story.
The UAE project is no longer just a speculative technology story. It is now publicly described as a 10-square-mile AI campus in Abu Dhabi that is expected to host 5 gigawatts of data center capacity, with the first phase, Stargate UAE, described by Reuters as a 1-gigawatt project and the first 200 megawatts expected to come online in 2026. That matters because once a project is announced at that scale, it should no longer be discussed only in geopolitical or investment language. It should be discussed in electrical terms. That is exactly where the benchmarking framework becomes useful. It allows us to take a project that sounds grand in narrative form and restate it as a set of disciplined first-pass estimates about load, annual electricity use, peak burden, and wider system significance. That is the value of the framework set out above. It is not claiming exact engineering precision. It is creating a repeatable way to move from announcement to measurable infrastructure consequence.
The first and most useful equation in the framework converts capital investment into estimated IT load. In this case, however, the public reporting gives us the electrical scale more clearly than it gives us a settled final investment figure. So the most credible way to apply the equation is in reverse. If the reported 5 GW refers to total facility load, and if we use the same benchmark PUE of 1.30 from the framework, then the implied IT load is about 3,846 MW. If we then apply the framework’s capital-intensity benchmark of $10 million to $12 million per MW of IT load, the implied investment envelope becomes roughly $38.5 billion to $46.2 billion. That does not mean the project will land at exactly that number. It means that, under the framework’s own logic, the electricity ambition being discussed is consistent with a project measured in tens of billions of dollars rather than in symbolic headline billions.
The second step is to convert that scale into annual electricity terms. Using the framework’s own planning assumptions of PUE = 1.30 and U = 0.85, a 5 GW campus implies annual electricity use of about 37.23 TWh per year. That is the single most important benchmark in the whole exercise because it converts a large but abstract digital-infrastructure project into an annual electricity number that can be compared to grids, sectors, and other economic uses of power. On the same basis, the 1 GW Stargate UAE phase implies about 7.45 TWh per year, and the first 200 MW phase implies about 1.49 TWh per year. Once the project is translated into those numbers, it stops looking like abstract AI ambition and starts looking like major new industrial demand on a power system.
From there, the framework asks the right comparative question, which is not simply how large the project is in isolation, but how large it is relative to the wider electricity system. Using published 2024 utility figures as a practical benchmark, DEWA reported 59,594 GWh of annual demand and a 10.76 GW peak, while EWEC reported 107,729 GWh of global annual demand and an 18.623 GW global peak. Those figures are not a perfectly harmonised single national series, but they are still useful for first-pass comparison. On that basis, the full 5 GW campus would amount to roughly 22.3% of this indicative annual UAE electricity benchmark and about 17.0% of this indicative combined peak benchmark if coincidence with system peak were full, or about 15.3% if a 0.9 coincidence factor were used instead. That is the point where a data center announcement becomes systemically important rather than merely impressive.
The cooling burden ratio is especially important in a Gulf context because the energy cost of computing in hot climates is never just the energy cost of computation itself. The IEA notes that cooling already makes up nearly half of peak electricity demand in MENA and one-quarter of annual electricity demand. If we therefore apply a conservative planning assumption that 25% to 30% of the campus’s own electricity use is effectively tied to cooling, then the Abu Dhabi campus would imply roughly 9.31 to 11.17 TWh per year devoted to thermal management alone. That is an extraordinarily important result because it shows that a meaningful share of the electricity requirement is not being used to compute in the narrow sense—it is being used to make high-density computation possible under harsh climatic conditions.
The next step is to use the framework’s social and GCC-specific comparison ratios. If we use an illustrative household benchmark of 30 to 40 MWh per household per year, then 37.23 TWh corresponds to roughly 0.93 million to 1.24 million household-equivalents. Again, that should be used only as a scale illustration, not as a moral comparison. The desalination-equivalent ratio is even more revealing in a Gulf setting. The IEA notes that future desalination growth in the region is expected to be met by electricity-powered reverse osmosis technologies, and it gives a broad electricity range that makes a midpoint planning value of 4 kWh per cubic metre reasonable for benchmarking. On that basis, the same annual electricity used by the full Abu Dhabi campus would be equivalent to about 9.31 billion cubic metres of desalinated water per year. That does not mean data centers and desalination are interchangeable. It means that the electricity scale of the project is large enough to be discussed alongside one of the Gulf’s defining energy-water systems.
Bloomberg Law quoted the project’s 5 GW of demand as being “the size of powering the city of Miami.” Put differently, the project is enormous not just by UAE standards but globally, and it needs two separate comparisons: one for instantaneous load and one for annual electricity use.
On the instantaneous-load side, DEWA reported Dubai’s 2024 peak demand at 10.76 GW, and EWEC reported its 2024 global peak demand at 18.623 GW. If you use those two official utility series together as an indicative UAE-wide benchmark, the comparison is about 17.0% of that combined peak. If you compare it only to EWEC’s own global system, where the campus would actually sit, it is even larger at about 26.8% of EWEC’s 2024 peak. That is why this project is better described as a major system load than as a normal industrial addition.
On the annual-electricity side, the campus running flat out all year would use about 43.8 TWh per year (43.8 TWh is the 100% utilization case and 37.23 TWh is the 85% case). Using the same indicative UAE benchmark as before, based on DEWA’s 59,594 GWh of 2024 demand plus EWEC’s 107,729 GWh of 2024 global demand, that would equal about 26.2% of that combined annual total. Using the more cautious planning assumption from earlier, with effective utilization around 85%, the number drops to about 37.2 TWh per year, or roughly 22.3% of that same benchmark. So the right takeaway is that the project is not “a few percent” of the UAE system. It is closer to one-fifth to one-quarter of the country’s electricity scale, depending on how hard it actually runs.
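The share calculations above can be reproduced directly from the DEWA and EWEC 2024 figures as stated in the text:

```python
# Indicative UAE-wide benchmarks from the text: DEWA + EWEC 2024 figures.
ANNUAL_GWH = 59_594 + 107_729   # combined annual demand, GWh
PEAK_GW = 10.76 + 18.623        # combined peak, GW
EWEC_PEAK_GW = 18.623

campus_gw = 5.0
annual_100 = campus_gw * 8760 / 1000   # TWh/yr at 100% utilization
annual_85 = annual_100 * 0.85          # TWh/yr at 85% utilization

print(f"{100 * annual_100 * 1000 / ANNUAL_GWH:.1f}% of annual benchmark (100% util)")
print(f"{100 * annual_85 * 1000 / ANNUAL_GWH:.1f}% of annual benchmark (85% util)")
print(f"{100 * campus_gw / PEAK_GW:.1f}% of combined DEWA+EWEC peak")
print(f"{100 * campus_gw / EWEC_PEAK_GW:.1f}% of EWEC peak alone")
```

This yields the roughly 26.2%, 22.3%, 17.0%, and 26.8% figures used in the paragraphs above.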
The household-equivalent ratio, under the planning assumption of 85% utilization (37.23 TWh per year), gives an intuitive social comparison, but it should not be used to imply that households and data centers are interchangeable uses of power. The point is scale illustration, not moral simplification. The difficulty is that no single recent official UAE-wide household average is consistently published in a form that fits this equation neatly, so the cleanest benchmarking move is to keep H as an explicit planning assumption. The UAE government states that residents use roughly 20 to 30 kWh of electricity per day; translated into household terms, that makes around 30 to 40 MWh per household per year a reasonable illustrative range for a high-consumption Gulf household. On that basis:
Household Equivalent = 37.23 million MWh / 30 = 1.24 million households
Household Equivalent = 37.23 million MWh / 40 = 0.93 million households
So the Abu Dhabi campus can be said to use electricity on a scale roughly equivalent to the annual consumption of about 0.93 million to 1.24 million households, depending on the household benchmark chosen. That is precisely the kind of comparison that helps grasp the scale without pretending that domestic and digital uses of electricity are socially or economically interchangeable.
The desalination-equivalent ratio, under the planning assumption of 85% utilization (37.23 TWh per year), is a very strong GCC-specific benchmark because it links digital infrastructure to one of the region’s defining energy-water realities. It does not mean data centers “replace” desalination; rather, it shows what else the same electricity could theoretically support. The IEA notes that, depending on technology and system boundary, desalination electricity intensity varies meaningfully: thermal plants typically consume about 2 to 4 kWh/m³ of electricity, while seawater reverse osmosis is roughly 2.5 to 6 kWh/m³ for the core desalination and related steps combined. If we use d = 4 kWh/m³ as a practical midpoint planning benchmark, then:
Equivalent Desalinated Water Volume = 37.23 billion kWh / 4 = 9.31 billion m³/year
If we show the sensitivity range instead, the comparison becomes even more revealing:
At d = 2.5 kWh/m³, equivalent volume = 14.89 billion m³/year
At d = 6 kWh/m³, equivalent volume = 6.21 billion m³/year
So the full Abu Dhabi AI campus would be equivalent to roughly 6.2 to 14.9 billion cubic metres of desalinated water per year, with a central planning benchmark of about 9.3 billion cubic metres per year. In a Gulf context, that is a useful comparison because it places digital infrastructure directly alongside one of the region’s most strategic electricity-linked systems.
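Both equivalence calculations above follow the same pattern: divide the campus’s assumed annual electricity by a per-unit benchmark. A minimal sketch, using the illustrative household benchmark (30 to 40 MWh per household per year) and the IEA-derived desalination intensities quoted in the text, all of which are planning assumptions rather than measured values:

```python
# Household- and desalination-equivalent scale comparisons for the
# 85% utilization case (37.23 TWh/year). All benchmarks are the
# illustrative planning assumptions discussed in the text.

campus_mwh = 37.23e6  # 37.23 TWh expressed in MWh
campus_kwh = 37.23e9  # the same quantity in kWh

# Household equivalents at the low and high benchmark consumption levels
households_at_40 = campus_mwh / 40  # high-consumption benchmark
households_at_30 = campus_mwh / 30  # lower benchmark

print(f"Household equivalent: {households_at_40/1e6:.2f}M "
      f"to {households_at_30/1e6:.2f}M households")

# Desalination equivalents across the electricity-intensity range (kWh/m3)
for d in (2.5, 4.0, 6.0):
    volume_bcm = campus_kwh / d / 1e9  # billion cubic metres per year
    print(f"d = {d} kWh/m3 -> {volume_bcm:.2f} bcm/year")
```

The output matches the ranges quoted above: about 0.93 to 1.24 million households, and roughly 6.2 to 14.9 billion cubic metres of desalinated water per year, with 9.31 bcm at the 4 kWh/m³ midpoint.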
What applying the benchmarking framework to the Abu Dhabi AI campus shows is that: (i) the project should be treated as major electricity infrastructure and not only as a digital-investment story; (ii) the reported 5 GW campus implies roughly 3.85 GW of IT load and about 37.23 TWh of annual electricity demand under standard planning assumptions; (iii) at full buildout it appears large enough to represent a material share of annual electricity demand and peak-system stress in the UAE; (iv) in a Gulf climate a very substantial part of its electricity burden may be tied simply to cooling; (v) the most useful way to understand such projects is not through headline investment figures alone but through a common measurement language that links money to megawatts, annual TWh, peak burden, and water-linked system tradeoffs; (vi) the project would amount to roughly 17% of an indicative UAE peak-load benchmark, around 27% of EWEC’s own 2024 peak demand, and roughly 22% to 26% of an indicative UAE annual electricity benchmark depending on utilization, which helps explain why analysts have reached for big-city analogies such as the Bloomberg Law comparison that 5 GW is “the size of powering the city of Miami;” (vii) on a household-equivalent basis its electricity use is on the order of roughly 0.93 million to 1.24 million households depending on the benchmark chosen; and (viii) in a GCC-specific water-energy comparison its annual electricity requirement is large enough to be expressed in desalination-equivalent terms measured in billions of cubic metres per year.
In hot climates, digital infrastructure is never only about computation; it is also about the growing burden of keeping that computation alive.
When people compare data centers to “all national electricity use,” they often miss the most important GCC-specific issue, which is when electricity is needed and what else is competing for it. The IEA says cooling already makes up nearly half of peak electricity demand in MENA and about one-quarter of annual electricity demand. That means a Gulf grid can look manageable on an annual basis while still becoming severely stressed on specific summer afternoons and evenings. A large data center that operates around the clock may seem easier to plan for than weather-driven residential demand because its load is more stable, but that stability cuts both ways. It means data centers do not disappear when the sun sets, and they do not naturally reduce demand when households, offices, malls, and transport systems are all pulling hard on the system. In the GCC, therefore, the right comparison is not simply “What percentage of annual generation will data centers use?” It is also “How will they interact with the grid’s hardest hours?”
Cooling technology makes this even more consequential. DOE says data center electricity is mainly consumed by equipment and by HVAC, with cooling commonly accounting for 25%–40% of data center energy use; in another DOE announcement, cooling was described as accounting for up to 40% of overall data center energy usage. In a hot Gulf climate, that should immediately change the framing. The question is not just how much power servers need; it is how much additional electricity is required to remove the heat they generate in one of the world’s hottest operating environments. This is precisely why the Gulf should care about liquid cooling, waste-heat strategies, ambient-temperature design, thermal storage, and flexible demand architecture. A 300 MW IT-heavy development is not just a server story. It is a heat-rejection story. That is why we should emphasize that the region’s comparative advantage in land, capital, and connectivity does not automatically solve the thermodynamics of AI infrastructure.
There is also a water-policy implication even when direct water use is not the main metric discussed publicly. The IEA’s warning on desalination growth means electricity planners are already preparing for more power-intensive water provision across the region. If data centers are sited, cooled, and contracted in ways that increase pressure during the same periods when cooling and desalination needs are climbing, then the issue becomes systemic. The Gulf would essentially be deepening three electricity-intensive loops at once: urban cooling, desalinated water, and AI-scale compute. That does not mean the region should avoid data centers. It means it should compare them to the loads that actually shape resilience. In the GCC, the smartest comparison is not with a vague national average. It is with the existing trio that already governs grid difficulty: summer cooling, water security, and peak reliability.
What OHK shows about why the Gulf should compare data centers to peak stress and system competition rather than only annual national electricity totals: (i) annual electricity statistics by themselves can understate the real system risk of new digital infrastructure; (ii) data center cooling demand is especially important in hot climates and should be treated as core infrastructure rather than a secondary operational detail; (iii) Gulf planners should evaluate projects based on load shape, peak coincidence, and thermal design rather than on capital value alone; (iv) reliability risks grow when computing load, urban cooling demand, and desalination-related electricity growth intensify at overlapping times; and (v) the most meaningful resilience benchmark in the GCC is behavior during the hardest peak hours, not only the annual energy average.
The Gulf’s opening lies not just in attracting capital, but in becoming the geography where strategic computing capacity chooses to concentrate.
It would be a mistake to write this subject as a warning only. The GCC also has real structural advantages that could make it one of the most important global locations for AI and cloud infrastructure. Saudi Arabia and the UAE are attracting capital at a scale that few other regions can match. Saudi Arabia’s 2025 announcements combine public ambition, sovereign alignment, and hyperscaler interest. The UAE’s cloud-region investments, telecom participation, and hyperscale positioning show a different but equally serious model. Qatar and Bahrain already provide operating examples of regional cloud infrastructure, while Kuwait is actively drawing in sovereign-AI platforms. This means the Gulf is not beginning from zero. It already has data-residency demand, government-backed digitization agendas, subsea and terrestrial connectivity ambitions, and strong reasons to host workloads closer to users in the Middle East, South Asia, and Africa. The Gulf can credibly argue that it is building a regional digital backbone, not just local server space.
There is also a genuine energy opportunity. The IEA says the region’s power mix is expected to change substantially through 2035, with natural gas meeting a large portion of demand growth, oil-fired output falling, solar PV capacity rising sharply, and nuclear power expanding. That does not solve the near-term power problem, but it does mean the Gulf has a credible pathway to pair digital expansion with better electricity architecture than the region had in the past. If policymakers use long-term power purchase agreements, dedicated renewable additions, gas-backed firmness, grid modernization, and high-efficiency cooling rules, then new data center demand could become a catalyst for better power planning rather than a burden on it. The same IEA analysis also says grid modernization and regional interconnections will be critical. That point matters because data centers are unusually sensitive to reliability and latency; they can become anchor customers that justify investments in transmission, redundancy, storage, and smarter system operations.
The key is to avoid a false choice between growth and restraint. The Gulf does not need to choose between saying yes to data centers and protecting its energy future. It needs to choose what kind of yes it offers. A weak yes is cheap land, generous incentives, and unspecified electricity. A strong yes is guaranteed efficiency standards, clear grid-connection rules, phased capacity delivery, renewable matching, advanced cooling requirements, and planning that aligns data-center growth with industrial policy. In that stronger model, data centers can serve as more than IT facilities. They can become anchors for power-market reform, storage deployment, clean firming solutions, and high-skill workforce development. We should acknowledge both truths at once: the Gulf has a serious opportunity and the opportunity will be wasted if energy is treated as an afterthought.
What OHK highlights about the Gulf’s real opportunity to become a major global compute hub if it aligns digital growth with power-system modernization: (i) the GCC has a credible chance to become a regional or even wider compute hub because capital, policy, and demand are increasingly aligned; (ii) existing cloud regions in Bahrain and Qatar give the region operational credibility rather than mere aspiration; (iii) large data centers can justify and accelerate better grid infrastructure if treated as anchor loads rather than opportunistic consumers; (iv) the Gulf’s competitive advantage will depend on the quality of its energy architecture and not only on sovereign investment capacity; and (v) a disciplined, infrastructure-aware yes to data centers is strategically far more valuable than a fast but poorly structured expansion.
When digital expansion moves faster than the energy logic, grid readiness, and system discipline required to sustain it, pressure stops producing strength and begins creating structural fragility.
The biggest long-term danger is not that the GCC suddenly becomes overrun with data centers. It is that it builds them under a weak energy logic: abundant land, fast approvals, generous incentives, and too little discipline on how electricity is sourced, timed, and cooled. If that happens, digital infrastructure could expand faster than power-system resilience. The IEA already warns that MENA demand is rising rapidly and that cooling and desalination are major growth drivers, while digital infrastructure is an additional source of demand growth. Add to that the fact that the region still relied on oil and gas for over 90% of electricity generation in 2024, and the risk becomes obvious. The Gulf could end up celebrating AI industrialization while quietly deepening fuel burn and peak stress. That would be especially shortsighted in countries that are also trying to free oil from the power sector for higher-value use or export. A weakly planned data center boom could work against broader economic strategy rather than reinforce it.
Another risk is that public conversations can become distorted by hype. Announcements measured in billions of dollars sound transformative, but without transparency on actual power profiles, utilization ramps, cooling systems, backup architecture, and renewable matching, policymakers cannot tell the difference between prestige announcements and productive infrastructure. The DOE’s U.S. work is useful here because it demonstrates how quickly data center electricity can scale once AI workloads accelerate: from 58 TWh in 2014 to 176 TWh in 2023, with estimates of 325–580 TWh by 2028. The lesson is not that the Gulf will automatically follow the U.S. curve. The lesson is that once AI demand takes off, energy forecasts can move quickly and conventional planning can lag. That is why Gulf utilities and regulators should assume that compute demand can surprise on the upside, and should build governance that prevents speculative over-commitment from outrunning actual system readiness.
There is also a subtler risk: locking in yesterday’s efficiency assumptions for tomorrow’s compute. DOE has already emphasized the growing importance of better cooling because HVAC can consume 25%–40% of data center energy. In a Gulf context, that means design choices made now will shape energy performance for years. If operators build around mediocre air-cooling assumptions, weak thermal standards, and limited flexibility, the region could hardwire unnecessary energy penalties into infrastructure that is supposed to last decades. By contrast, if regulators push for advanced cooling, staged development, and better demand management from the start, then the GCC can leapfrog older models. So the critical note should not be anti-growth. It should be precise: the wrong kind of speed is expensive. Fast buildout without disciplined energy design turns a strategic opportunity into a structural liability.
Concerns about the danger of building data centers quickly but under the wrong energy and cooling logic: (i) Gulf governments should be more concerned about weak energy design than about data center growth in itself; (ii) large investment announcements without transparent load and infrastructure details can mislead both markets and policymakers; (iii) AI-driven demand can rise faster than conventional electricity-planning cycles are designed to handle; (iv) poor cooling choices made early can lock in long-term inefficiency and unnecessary energy waste in hot climates; and (v) strategic speed is only beneficial when matched by equal strategic discipline in sourcing, cooling, and system planning.
Strategic infrastructure requires more than expansion alone; it depends on who holds the authority to shape access, rules, and system control.
A strong GCC data center strategy would begin by refusing to treat every megawatt as equal. Governments should differentiate between ordinary cloud hosting, latency-sensitive sovereign workloads, and energy-intensive AI training or inference clusters. These categories do not impose the same burden on grids, nor do they create the same national value. For example, sovereign public-sector workloads may justify premium reliability and local hosting even at modest scale. Large AI campuses, by contrast, should face much tighter scrutiny over grid impact, efficiency, and electricity sourcing because their economic spillovers do not automatically outweigh their system costs. That means incentives should move away from generic tax friendliness and toward performance-based infrastructure policy: power usage effectiveness (PUE) targets, advanced cooling requirements, staged energization, grid-connection sequencing, and transparency on expected load growth. The GCC has enough state capacity to do this well. The issue is whether it chooses precision over publicity.
Second, governments should explicitly tie new data center approvals to power-system improvements. The IEA says grid modernization and regional interconnections are critical for the future of electricity in MENA. That point should not sit in a separate energy report while digital infrastructure moves on another track. A smarter model would require very large data center developments to contribute to, or at least align with, new renewable additions, storage, transmission reinforcement, thermal-efficiency upgrades, and flexible operating protocols. In practice, that could include dedicated renewable buildout, time-sensitive demand arrangements, thermal storage for cooling, or backup strategies that reduce pressure during system stress. None of this requires saying no to growth. It requires making growth conditional on strengthening the system it depends on. If the GCC gets this right, data centers can become partners in grid modernization rather than passive consumers of ever more electricity.
Third, the Gulf should make cooling innovation a regional policy priority rather than leaving it to individual developers. DOE’s work shows why this matters: HVAC can consume a large share of total data center energy, and DOE has specifically funded projects to cut cooling energy for data centers. In the Gulf, where ambient conditions are harsher, cooling should be treated almost like a national industrial capability. The region should want to lead in liquid cooling, high-temperature operation standards where feasible, waste-heat use cases, district-cooling interfaces, water-conscious design, and thermal optimization software. The Gulf’s competitive advantage will not come only from hosting data centers. It can also come from becoming the place that learned how to run AI infrastructure efficiently in extreme heat. That would be a global contribution, not just a regional one.
How we lay out the policy architecture required for a smarter and more selective GCC data center strategy: (i) different classes of data center workload should not receive identical incentives or grid priority because their system value and system burden differ; (ii) performance-based approvals are stronger than generic infrastructure promotion and should shape the next phase of expansion; (iii) very large projects should be tied directly to cleaner power additions, grid reinforcement, and resilience upgrades; (iv) cooling innovation should be elevated into a regional industrial capability rather than left to fragmented developer choices; and (v) the GCC can build a distinct competitive advantage by becoming exceptionally good at operating efficient compute infrastructure in extreme climates.
The real significance of data centers lies not inside the facility alone, but in the urban, economic, and regional systems they reorganize.
It is tempting to classify data centers as part of the digital economy and leave the matter there. But in the Gulf they sit at the crossroads of urbanism, industrial policy, energy security, and geopolitical positioning. Cities in the GCC already carry some of the world’s most intense cooling burdens. Water security already depends heavily on energy-intensive desalination. Governments are trying to diversify economies while improving digital sovereignty, attracting foreign capital, and building new industrial clusters. Into that setting arrives AI infrastructure, which requires land, power, cooling, fiber connectivity, high-value equipment, specialist operations, and very high reliability. That is why the right approach should consider what kind of built environment and economic model the Gulf is choosing. The server hall is only the visible object. The true story is the system assembled around it.
There is also a geopolitical layer. Bahrain, Qatar, Saudi Arabia, the UAE, and Kuwait are all, in different ways, trying to reduce latency, strengthen digital sovereignty, and position themselves as secure regional nodes. Microsoft’s announcements in Qatar, Saudi Arabia, and Kuwait, AWS’s regions in Bahrain and the UAE, and Google Cloud’s presence in Qatar and expansion in Kuwait all reflect that competition. These companies are not merely selling cloud services; they are embedding themselves into regional economic geography. And GCC states are not merely buying computing. They are competing to host the places where computation happens. This matters because infrastructure location shapes regulatory influence, talent pipelines, resilience, and investment gravity. A country that hosts critical AI and cloud infrastructure gains more than server racks. It gains a stronger role in the region’s future digital map.
That broader frame helps explain why considerations matter beyond energy specialists. The GCC is effectively negotiating a new social contract between digital ambition and physical constraint. It wants AI leadership, cloud sovereignty, industrial diversification, and global relevance. But it must reconcile those goals with peak electricity stress, fossil-heavy power systems, and water insecurity. That tension is not a weakness in the story; it is the story. The most interesting question is not whether the Gulf can build data centers. It clearly can. The deeper question is whether it can make them serve a larger regional model of resilience, efficiency, and strategic autonomy instead of merely becoming another electricity-hungry layer laid on top of already stressed systems. That is Part I’s real center of gravity.
OHK underlines why the Gulf’s data center boom is really an urban, industrial, geopolitical, and environmental story at the same time: (i) data centers in the GCC should be understood as urban and industrial infrastructure rather than as narrow technology assets; (ii) hosting compute capacity also means gaining influence over regulation, resilience, and regional digital geography; (iii) the infrastructure race is simultaneously economic, geopolitical, and environmental in character; (iv) the most important GCC question is not whether it can adopt technology but whether it can design systems around that technology intelligently; and (v) the strongest approach connects cloud and AI strategy to the deeper physical realities of energy, water, and the built environment.
The final question is not whether digital infrastructure can expand, but whether it can be balanced against the physical systems that must sustain it.
The strongest conclusion is not that the GCC should slow down. It is that the GCC should become far more intelligent about how it accelerates. The facts already point to a region moving decisively into the data center era. Saudi Arabia has paired huge AI-investment announcements with large-capacity ambitions and concrete cloud buildout timelines. The UAE is deepening both hyperscale and cloud-region capability. Qatar and Bahrain have already demonstrated that local cloud regions can operate and support broader digital transformation. Kuwait is actively positioning itself for an AI-powered cloud future. Meanwhile, Google Cloud has signaled continuing physical expansion in Kuwait, while the MENA electricity outlook has become more demanding, not less. The pieces are clearly assembling into a regional compute geography. What remains unsettled is whether that geography will be merely large or genuinely well-designed.
Our core argument, then, should be simple but forceful. The Gulf’s next data center wave must be judged by more than ribbon cuttings, tenant names, or capital spending. It should be judged by whether new facilities reduce or worsen peak-system stress, whether they rely on better cooling logic, whether they catalyze cleaner and more resilient electricity, and whether they complement rather than compete destructively with the region’s huge cooling and desalination needs. The IEA has already made clear that the MENA electricity system is entering a period of fast growth, rising complexity, and large infrastructure requirements, while DOE’s U.S. data show how quickly AI can drive data center electricity upward once scale arrives. The GCC still has time to learn from that trajectory instead of replaying it blindly. That is precisely why now is the moment to look deeper into the data center boom, not later.
A persuasive closing tone should therefore avoid both techno-optimism and techno-pessimism. The Gulf should not romanticize data centers as frictionless engines of progress, and it should not reject them as inherently unsustainable. It should govern them like the strategic loads they are. If Gulf states demand better cooling, better power sourcing, better grid alignment, and better transparency, then data centers can become part of a stronger development model. If they do not, then the region risks mistaking digital visibility for physical readiness. This is why the data center conversation belongs not at the margins of Gulf policy, but near its center. It sits exactly where the region’s future will be decided: at the intersection of energy, water, infrastructure, and economic reinvention.
The conditions under which the GCC can turn data center expansion into a durable strategic advantage: (i) the GCC data center boom is real, region-wide, and increasingly tied to AI-scale infrastructure rather than conventional hosting alone; (ii) energy, cooling, and desalination pressures make the Gulf’s version of this story structurally different from many other world regions; (iii) the most revealing comparisons are those tied to peak demand, thermal burden, and grid resilience rather than headline investment alone; (iv) the region has a serious opportunity to become a global model for efficient compute in hot and water-constrained environments; and (v) long-term success will depend less on how many data centers get built than on the quality of the electricity, cooling, water, and governance systems built around them.
Part II of this series will ask a different but equally consequential question: what does AI and data center expansion actually mean for the GCC knowledge economy? It will explore whether hosting infrastructure creates real domestic value, whether GDP gains are shallow or transformative, whether local firms capture meaningful upside, whether talent formation becomes genuinely local or remains largely imported, whether the GCC evolves into a user market, a hosting market, or a true capability market, and what long-term sovereignty and competitiveness will require beyond physical buildout alone. These questions deserve a dedicated analysis in their own right. To force them too far into Part I would risk weakening its center of gravity around energy, infrastructure, and governance, while still treating the knowledge-economy issue too briefly to do it justice.
At OHK, we help clients move beyond the surface of AI and data center expansion to understand the deeper systems that determine long-term success. Our advisory work connects digital ambition with the realities of electricity, infrastructure, cooling, governance, investment strategy, and public legitimacy so that decisions are grounded not only in technological potential, but in physical, economic, and institutional feasibility. We support clients in turning complex questions about AI, data centers, and digital infrastructure into clear strategic frameworks that link capability, location, load, resilience, regulation, and long-term value. Whether the challenge is national strategy, infrastructure planning, governance design, or investment positioning, our aim is the same: sharper judgment, stronger systems thinking, and more durable outcomes. Contact us to learn how OHK can support your next phase of AI, infrastructure, and strategic transformation.