Decentralized AI Training: A Sustainable Solution for Energy Demands

April 8, 2026 · Lisa Park · Tech
At a glance
  • The escalating energy requirements of frontier artificial intelligence models are pushing the industry toward decentralized training.
  • Decentralization allows compute processes to be located where energy is already available.
  • Training AI models typically requires clusters of closely connected GPUs.
Original source: spectrum.ieee.org

The escalating energy requirements of frontier artificial intelligence models are pushing the industry toward decentralized training. This approach distributes the computational workload across a network of independent nodes rather than relying on massive, centralized data centers, potentially reducing the carbon footprint and electrical strain associated with the AI boom.

Decentralization allows compute processes to be located where energy is already available. This includes utilizing dormant servers in research laboratories or computers in solar-powered homes, avoiding the need to construct new energy-intensive infrastructure that would require significant power grid scaling.

Hardware Solutions for Distributed Compute

Training AI models typically requires clusters of closely connected GPUs. However, as large language models grow in size, single data centers are becoming insufficient. To address this, technology firms are developing hardware to pool the power of geographically separated facilities.

Nvidia has released Spectrum-XGS Ethernet for scale-across networking, designed to provide the performance necessary for large-scale, single-job AI training and inference across separated data centers. Similarly, Cisco has introduced the 8223 router to connect AI clusters located in different regions.

Beyond corporate data centers, some companies are leveraging idle compute through a GPU-as-a-Service model. The Akash Network operates as a peer-to-peer cloud computing marketplace where providers with underused GPUs in offices or small data centers can rent their capacity to tenants.

If you look at [AI] training today, it’s very dependent on the latest and greatest GPUs. The world is transitioning, fortunately, from only relying on large, high-density GPUs to now considering smaller GPUs.

Greg Osuri, cofounder and CEO of Akash

Algorithmic Shifts and Software Coordination

Moving training away from a single site requires significant software changes to handle communication and reliability. Federated learning is one such method, where a central server distributes a global model to participating organizations. These participants train the model locally on their own data and share only the resulting model weights back to the central entity for aggregation.
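
The aggregation step can be made concrete with a short sketch. The snippet below is a minimal, illustrative federated-averaging round in Python: the toy least-squares model, the function names, and the unweighted averaging rule are assumptions for the example, not the protocol of any particular system.

```python
import numpy as np

# Minimal sketch of a federated-averaging round. The "model" is a flat NumPy
# weight vector and the local objective is a toy least-squares problem; real
# deployments use full networks and secure channels.

def local_training_step(weights, local_data, lr=0.01):
    """Hypothetical local update: a participant adjusts the weights using
    only its own data, which never leaves the participant."""
    X, y = local_data
    grad = X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def federated_round(global_weights, participants):
    """One round: send the global model out, train locally at each
    participant, and average only the returned weights."""
    local_weights = [
        local_training_step(global_weights.copy(), data) for data in participants
    ]
    return np.mean(local_weights, axis=0)

# Toy usage: three participants, each holding private data the server never sees.
rng = np.random.default_rng(0)
participants = [(rng.normal(size=(32, 4)), rng.normal(size=32)) for _ in range(3)]
weights = np.zeros(4)
for _ in range(10):
    weights = federated_round(weights, participants)
```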

However, decentralized training faces challenges with communication costs and fault tolerance. In traditional AI training, if a single node fails, the entire batch of work often has to be restarted.

To mitigate these issues, researchers at Google DeepMind developed DiLoCo, a distributed low-communication optimization algorithm. DiLoCo organizes hardware into "islands" of compute, where groups of chips of the same type work together. These islands are decoupled, meaning they synchronize knowledge only periodically. This structure allows islands to perform training steps independently and prevents a single chip failure from interrupting the rest of the system.
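
The shape of that loop can be sketched in a few lines of Python. The version below is a simplified illustration under stated assumptions: a toy least-squares objective, plain gradient steps inside each island, and an unweighted average of the islands' deltas at synchronization time. The actual algorithm uses different inner and outer optimizers; only the structure of infrequent synchronization is meant to carry over.

```python
import numpy as np

# Simplified sketch of the "islands of compute" pattern: each island runs many
# local steps on its own data shard, and the islands synchronize only
# periodically by averaging their accumulated parameter deltas.

INNER_STEPS = 50   # local steps an island takes between synchronizations
ROUNDS = 20        # number of periodic synchronizations

def inner_loop(weights, shard, lr=0.01):
    """One island trains alone for INNER_STEPS on its own data shard."""
    X, y = shard
    w = weights.copy()
    for _ in range(INNER_STEPS):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def outer_sync(global_weights, island_weights, outer_lr=1.0):
    """Periodic sync: average the islands' deltas and fold them into the
    shared weights."""
    deltas = [w - global_weights for w in island_weights]
    return global_weights + outer_lr * np.mean(deltas, axis=0)

rng = np.random.default_rng(1)
shards = [(rng.normal(size=(64, 8)), rng.normal(size=64)) for _ in range(4)]
global_w = np.zeros(8)
for _ in range(ROUNDS):
    # Islands work independently between syncs; an island that fails could be
    # skipped for a round without stalling the others.
    island_w = [inner_loop(global_w, shard) for shard in shards]
    global_w = outer_sync(global_w, island_w)
```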

A subsequent version, Streaming DiLoCo, further reduces bandwidth requirements by synchronizing knowledge in a streaming fashion in the background, rather than stopping the computational work for communication.
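
One way to picture the bandwidth saving is a schedule that synchronizes only a fragment of the parameters in any given round, so each exchange moves a fraction of the bytes. The sketch below shows only that scheduling idea and is an illustrative assumption, not the published algorithm; overlapping the exchange with ongoing compute and compressing the transmitted deltas are what the real method layers on top.

```python
import numpy as np

# Illustrative fragment-wise synchronization schedule: split the parameter
# vector into fragments and synchronize a different fragment each round,
# instead of moving the whole model at every sync point.

N_FRAGMENTS = 4

def fragment_slices(n_params, n_fragments=N_FRAGMENTS):
    """Split parameter indices into contiguous fragments."""
    bounds = np.linspace(0, n_params, n_fragments + 1, dtype=int)
    return [slice(a, b) for a, b in zip(bounds[:-1], bounds[1:])]

def streaming_sync(global_w, island_ws, round_idx):
    """Synchronize one fragment this round, rotating through fragments."""
    frag = fragment_slices(len(global_w))[round_idx % N_FRAGMENTS]
    deltas = np.stack([w[frag] - global_w[frag] for w in island_ws])
    updated = global_w.copy()
    updated[frag] += deltas.mean(axis=0)
    return updated

# Toy usage: three islands that have drifted slightly from the shared weights.
rng = np.random.default_rng(2)
global_w = np.zeros(16)
island_ws = [global_w + 0.1 * rng.normal(size=16) for _ in range(3)]
for round_idx in range(N_FRAGMENTS):
    global_w = streaming_sync(global_w, island_ws, round_idx)
```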

Several organizations have already implemented these techniques:

  • Prime Intellect used a DiLoCo variant to train its 10-billion-parameter INTELLECT-1 model across three continents and five countries.
  • 0G Labs adapted DiLoCo to train a 107-billion-parameter foundation model using segregated clusters with limited bandwidth.
  • The open-source deep learning framework PyTorch has included DiLoCo in its repository of fault tolerance techniques.

The Path Toward Residential Data Hubs

The long-term goal of decentralization is to move AI training to where the energy is, rather than moving energy to where the AI is. This is exemplified by the Akash Starcluster program, which aims to convert solar-powered homes into functional data centers by utilizing desktops and laptops.

Turning a home into a provider node requires more than just a consumer-grade GPU and solar panels. Participants would need to invest in redundant internet and battery backups to prevent downtime. Akash is currently working with industry partners to subsidize battery costs and simplify the packaging of these requirements for homeowners.

The network is developing backend capabilities to allow homes to participate as providers, with a target date of 2027. The program also envisions expanding into other solar-powered community sites, such as schools.

While some industry players are still scaling traditional data centers—such as xAI’s Colossus in Memphis and OpenAI’s Abilene Stargate—the push for decentralized training offers a potential alternative to the grid constraints that currently bottleneck AI growth.
