
Tech Matters: An Uber model for data centers?

By Leslie Meredith - Special to the Standard-Examiner | May 12, 2026


Uber upended the taxi industry by opening paid ride-sharing to individuals with a clean car and a clean driving record. Today, Uber is available around the world, has become both a noun (“I’ll get an Uber”) and a verb (“Let’s Uber to the party”) and continues to offer rides at lower cost than a conventional taxi. Can the data center market follow a similar model? A trio of companies, Span, Nvidia and PulteGroup, thinks so.

Enter the concept of distributed data centers. The demand for computing power for AI is pushing the supply of electricity beyond its limits. The U.S. Department of Energy projects that electricity used by data centers could nearly triple by 2028. In Utah, AI infrastructure proposals are growing rapidly in both size and power demand.

The proposed Stratos Project in Box Elder County, backed by Kevin O’Leary and approved by the Military Installation Development Authority in April 2026, would combine a massive AI data center with its own energy campus on about 40,000 acres, more than twice the size of Ogden. At full buildout, the project could use up to 9 gigawatts of electricity, more than double Utah’s current power use, which is why residents and state officials are pressing for more scrutiny before it moves ahead.

But the seemingly simple answer, building more data centers as more people use AI, comes with a number of challenges.

Large data centers are expensive to build, often require a new power source and can take years to get connected to the grid, creating what the industry calls a speed-to-power gap for AI compute. At the same time, there is not enough electricity infrastructure in many areas to support the additional demand.

That shortage has turned some data center developers into power plant developers. Microsoft, for example, signed a deal to help restart a reactor at Three Mile Island to secure carbon-free electricity for its AI operations.

Still, a shortage of transformers and other equipment has led to rising costs and project delays. Transformer lead times are now measured in years rather than months, and recent industry estimates suggest that 30% to 50% of large data center projects planned for this year could be delayed waiting for power infrastructure and electrical equipment to catch up.

This is where the Uber model comes in. The idea is to create networks of home- and small business-based processing nodes that together would create distributed data centers using excess electricity in local grids and land already owned by users.

Leading the charge is Span, a San Francisco company known for smart electrical panels that monitor and manage home energy use. Its new XFRA units are small outdoor AI compute nodes, roughly the size of an HVAC condenser, designed to sit outside a home or small business. Span is partnering with Nvidia for the chips and with PulteGroup, the parent company of Pulte Homes, as well as other homebuilders, to test the concept in about 100 newly built homes later this year. The company has not yet announced where the test will take place.

Span says it could install 8,000 XFRA units about six times faster and at one-fifth the cost of a conventional centralized data center with similar computing capacity. The company says most homes use only part of the electrical capacity already allocated to them, leaving room for AI computing without major infrastructure upgrades.

The XFRA unit would operate independently outside the home, processing AI workloads inside the unit itself. Homeowners would receive the equipment at no upfront cost, including a Span electrical panel, battery backup and the XFRA unit. Span has said homeowners could receive lower electricity and internet costs in exchange for hosting the equipment.

This is not intended to replace giant data centers. Training advanced AI systems still requires enormous centralized facilities. Instead, the home units would handle inference, the part of AI that responds to questions, generates images or powers AI features in apps and devices. Because the processing happens closer to the user, response times could improve.

The challenges are easy to see. Homeowners will want answers about noise, heat, maintenance, security and liability. Utilities will need to determine whether these systems truly use spare residential capacity or simply shift commercial demand into neighborhoods. Cities and regulators may also question whether backyard AI units should be treated like home appliances (as easy as installing Google Fiber) or commercial infrastructure, which would involve a lot of red tape.

Then there’s the consumer question. Uber worked because drivers could understand the tradeoff: use your car, make extra money. Distributed data centers will need a similarly clear value proposition. How much would you actually save? What happens during a power outage? Who maintains the equipment?

So, would you install a micro-data center at your home if the equipment was free and your monthly utility bills dropped?

Leslie Meredith has been writing about technology for more than a decade. As a mom of four, value, usefulness and online safety take priority. Have a question? Email Leslie at asklesliemeredith@gmail.com.
