Key Points
- AI workloads pushing rack power densities from around 15 kW to 60-120 kW per rack
- Facilities efficient two years ago now face unsustainable power costs
- Industry leaders call this the defining inflection point for data centres
Two years ago, a data centre that consumed 15 kilowatts per server rack was considered efficient. Today, artificial intelligence workloads are demanding four to eight times that amount, and the facilities that once represented best practice are now haemorrhaging money on electricity bills. For operators across India, the maths has changed faster than their infrastructure.
Jaideep Roy, Director of Business Development at Vertiv, puts it bluntly. The problem is no longer about raw computing power. It is about the energy waste that comes with it. “Facilities that were efficient two years ago are now facing power bills that threaten profitability and grid constraints that block expansion,” he said. “We see this as the defining inflection point for the entire sector.”
This inflection point raises a question that every data centre operator in India must now answer: can the country’s digital infrastructure grow fast enough to meet AI demand without overwhelming the power grid and the planet? The answer will shape not only the data centre industry but also the cost and availability of every AI-powered service that Indian consumers and businesses increasingly depend upon.
Why AI workloads are breaking legacy infrastructure
The core challenge is physics. Traditional data centres were designed around air cooling, a technology that works well when each rack of servers generates modest heat. AI workloads, particularly those training large language models or running real-time inference, generate far more heat per square metre than conventional computing tasks.
Roy explained that AI-driven workloads are rapidly pushing rack densities beyond the limits of legacy air-cooled designs. The numbers are stark: densities are rising from around 15 kW to as high as 60-120 kW per rack. To put this in perspective, a single high-density AI rack can now consume as much power as 40 average Indian households.
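A rough back-of-envelope check makes the comparison concrete. The per-household figure below is an illustrative assumption (roughly 1.5 kW of average connected load), not a figure from the article, and real household loads vary widely:

```python
# Back-of-envelope comparison of AI rack power draw to household load.
# HOUSEHOLD_LOAD_KW is an illustrative assumption, not a measured figure.

LEGACY_RACK_KW = 15            # typical air-cooled rack, per the article
AI_RACK_KW_RANGE = (60, 120)   # high-density AI racks, per the article
HOUSEHOLD_LOAD_KW = 1.5        # assumed average household draw (illustrative)

for rack_kw in AI_RACK_KW_RANGE:
    households = rack_kw / HOUSEHOLD_LOAD_KW
    print(f"A {rack_kw} kW rack draws as much power as ~{households:.0f} households")
```

On those assumptions, the low end of the range matches the roughly 40-household comparison above, and the high end doubles it.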
Air cooling systems simply cannot remove heat fast enough at these densities. The result is either throttled performance, as servers slow down to avoid overheating, or enormous energy bills for ever more powerful air conditioning. Neither outcome is sustainable.
Liquid cooling and integrated design as the new baseline
The industry’s response is a fundamental redesign of how data centres manage heat. Liquid cooling, which pipes coolant directly to processors, can remove heat far more efficiently than air. Vertiv and other infrastructure providers are positioning this technology not as an upgrade but as a prerequisite for AI-scale operations.
“Our integrated liquid cooling architectures and intelligent power distribution platforms are purpose-built for exactly this reality,” Roy said. “They slash energy waste at the rack level, maintain rock-solid uptime and let customers support dramatically higher compute density without proportional increases in power draw or infrastructure footprint.”
The key phrase is “without proportional increases”. If a data centre can quadruple its computing output while only doubling its energy consumption, the economics shift dramatically. For operators facing both rising AI demand and pressure from regulators and investors on carbon emissions, this efficiency gap is existential.
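A minimal sketch of that arithmetic, using the illustrative multipliers from the paragraph above rather than measured figures:

```python
# Illustrative efficiency arithmetic: quadruple compute output
# while only doubling energy consumption.

baseline_compute = 1.0   # arbitrary units of useful AI work
baseline_energy = 1.0    # arbitrary units of energy

new_compute = baseline_compute * 4   # 4x compute output
new_energy = baseline_energy * 2     # only 2x energy draw

energy_per_unit = new_energy / new_compute
print(f"Energy per unit of compute: {energy_per_unit:.2f}x baseline")  # 0.50x
```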
Sustainability as operational necessity, not marketing
The framing of sustainability is changing. For years, data centre operators treated environmental credentials as a communications exercise, useful for annual reports and investor presentations but secondary to uptime and cost. That hierarchy is collapsing.
Narendra Sen, CEO and founder of RackBank Data Centers, argues that sustainability must now be embedded into every rack, every watt and every design decision. “Digital growth and environmental stewardship are not in conflict; they are complementary,” he said. “The industry must collectively reaffirm its commitment to decarbonising digital infrastructure at scale and building an AI-ready future that the planet can afford.”
Pratap Mane, President and Country Head for India at Colt Data Centre Services, frames this as a question of long-term viability. “The industry’s long-term success will be defined not just by its ability to scale, but by how efficiently and sustainably that scale is delivered,” he said.
Sustainability, in his view, is becoming fundamental to the future of digital infrastructure rather than a secondary objective.
The counterargument: can green infrastructure keep pace with demand?
Not everyone is convinced that efficiency gains can outrun demand growth. The challenge is straightforward: if AI adoption continues to accelerate, even highly efficient data centres will consume vastly more power in absolute terms. A facility that uses half the energy per computation still doubles its total consumption if it handles four times as many computations.
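The same arithmetic, run the other way, shows why absolute consumption still climbs. Again, the multipliers are the paragraph's illustrative ones, not forecasts:

```python
# Even halved energy per computation yields doubled total consumption
# if workload volume quadruples (illustrative numbers only).

energy_per_computation = 0.5   # half the baseline energy per computation
computations = 4.0             # four times as many computations

total_energy = energy_per_computation * computations
print(f"Total energy: {total_energy:.1f}x baseline")  # 2.0x
```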
India’s power grid, while expanding, faces its own constraints. Renewable energy capacity is growing but remains insufficient to meet existing demand in many states. Data centres competing for grid access with manufacturing, agriculture and residential users may find that efficiency is necessary but not sufficient.
The industry’s response has been to pursue distributed infrastructure. Rather than concentrating capacity in a few massive facilities, operators are building smaller data centres closer to users in emerging cities. This reduces transmission losses and, in theory, makes it easier to integrate local renewable generation.
Blue Cloud Softech Solutions, for instance, is developing what it calls Edge AI capabilities alongside its Blue Energy platform, designed to enable renewable and distributed energy adoption at scale.
What this means for Indian businesses and consumers
The infrastructure decisions being made today will determine the cost and performance of AI services for years to come. If data centre operators successfully transition to efficient, renewably powered facilities, AI-driven applications from healthcare diagnostics to agricultural forecasting could become cheaper and more widely available. If they fail, rising energy costs will be passed on to customers, and grid constraints could limit where and how quickly new capacity can be built.
For businesses considering AI adoption, this creates a new due diligence question. The sustainability credentials of cloud and data centre providers are no longer just about corporate responsibility; they are a proxy for operational resilience and long-term cost stability.
Roy’s characterisation of this moment as a defining inflection point is not hyperbole. The choices made now, in cooling technology, power sourcing and infrastructure design, will lock in patterns of energy consumption for decades.
The operators who treat efficiency as foundational rather than optional will likely be the ones still operating profitably when the next generation of AI workloads arrives demanding still more power.
Your Questions, Answered
Why are AI workloads creating problems for data centres?
AI tasks generate far more heat than traditional computing. Rack power densities are rising from 15 kW to 60-120 kW, overwhelming air cooling systems designed for lower loads.
What is liquid cooling and why does it matter?
Liquid cooling pipes coolant directly to processors, removing heat more efficiently than air. It allows data centres to handle AI workloads without proportional increases in energy consumption.
How does data centre energy use affect Indian consumers?
Rising data centre costs are passed on to cloud and AI service users. Efficient, sustainably powered facilities should keep AI services affordable and widely available.
Can renewable energy meet growing data centre demand in India?
Renewable capacity is growing but faces constraints. The industry is pursuing distributed infrastructure and local generation to reduce reliance on the central grid.


