Mears Broadband, a division of Mears Group (a subsidiary of Quanta Services), provides construction services for fiber broadband deployment projects. Mears Broadband President Trent Edwards says construction typically accounts for about 70% of a fiber deployment's total cost.
He said that percentage has held historically and remains accurate as scores of companies gear up for Broadband Equity, Access, and Deployment (BEAD) funds that will support new fiber projects across the U.S.
Asked if many newcomers to the fiber broadband space are aware of the 70% cost, Edwards said, “not really.”
He also warned that it is important to engage construction companies such as Mears early to ensure that projects stay on schedule and on budget.
Edwards said many of the people getting involved in this “Great Build” that is about to take place have never built broadband networks before. Among the novices, he includes many of the folks who have only recently been placed in roles at state broadband offices, along with the small staffs at local permitting offices who may not realize what’s about to hit them.
The biggest reason many areas in the U.S. have remained unserved or underserved by broadband is that private companies couldn’t make a business case to reach them. And even though BEAD funds will help subsidize that business case, and private equity firms also want to invest, companies will still crunch all the numbers to make sure they get a decent return on their investments.
Sustainability. If you are in IT—either as an IT executive or as a solutions provider selling into IT—this is a term you are learning to embrace. ESG (environmental, social and governance) has become a major focus for virtually every enterprise, meaning that major contributors to ESG efforts (like, ahem, IT) are being measured on how well they are aiding a company’s sustainability efforts—in this case, the reduction of organization-wide power consumption. If your organization hasn’t yet jumped on the ESG bandwagon, get ready. Whether by choice or by regulation, an ESG initiative is coming your way.
In this post, I’ll discuss the challenges I’ve been hearing about from IT executives across many industries, and spotlight an interesting solution I recently had the chance to preview.
Do more with less power
For many businesses, IT is a unit that contributes significantly to overall power consumption—and one that’s being tasked with reducing that power footprint. IBM recently conducted a study that showed that datacenters consume about 1% of the world’s power—about as much as the entire power consumption of Australia. The country. In the not-too-distant future, that number is projected to climb to 3%.
Seemingly at odds with sustainability initiatives are the digital transformation projects and edge computing expansions that IT executives are also tasked with implementing. They’re told to employ a cloud operating model and deploy micro datacenters everywhere along the edge so more data can be gathered—data that will be fed into machine learning models so the business can be more innovative and agile. But they must also do so while reducing power consumption by a significant amount. Makes perfect sense, no?
This scenario of contradictory initiatives is what IT executives face every day, leaving many of them scratching their heads and wondering if they are being set up for failure. For those not in IT, it’s important to understand that while power has always been a consideration, it has not always been treated as a precious commodity to conserve.
It’s also about density
While sustainability is a big deal, there is another challenge that IT executives face, especially as they look to the edge: compute density. 174ZB of data—that’s zettabytes—will be generated in 2025, with well over half of the average company’s data being generated at the edge. This mass of data must be aggregated, transformed and analyzed in real time. And for this scenario to work, edge environments will require more than a couple of tower servers with some hard drives and a gateway. The edge requires rich compute platforms that use many high-end CPUs, GPUs and other accelerators. More than that, these servers are deployed in some of the harshest conditions—on oil rigs, at the base of cellular towers, on the battlefield and so on.
Edge computing is the next frontier for IT, but its power, space and ruggedization barriers will continue to stall many IT projects. If only there were a way to cram a lot of compute into a small package that could also provide the necessary power and cooling. If only.
The answer is two-phase immersion cooling
While those in the datacenter space have been experimenting with various cooling techniques over the years, I think there may be a solution to all the challenges of edge computing described above: power, cooling, space, compute density and ruggedization. That solution is two-phase immersion cooling.
Two-phase immersion cooling houses servers in tanks filled with dielectric fluid. That’s right, servers run while they are immersed in fluid. And, yes, it defies conventional thinking. It works because the dielectric fluid removes heat from the server and its components without interfering with the server’s functionality. As it absorbs heat, the fluid boils into a vapor and rises. Cooling coils condense the rising vapor back into a fluid, which then returns to a reservoir so the cycle can continue over and over again.
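To make the energy balance of that boil-and-condense cycle concrete, here is a minimal back-of-the-envelope sketch. At steady state, the mass of fluid vaporized per second equals the server heat load divided by the fluid’s latent heat of vaporization. The numbers used (a 10 kW tank, a latent heat of about 88 kJ/kg, in the range of common engineered dielectric fluids) are illustrative assumptions, not vendor specifications.

```python
# Sketch of the steady-state energy balance in a two-phase immersion tank.
# Heat from the servers boils the dielectric fluid; cooling coils condense
# the vapor. All figures below are illustrative assumptions.

def boil_off_rate_kg_s(heat_load_w: float, latent_heat_j_kg: float) -> float:
    """Mass of fluid vaporized per second to absorb heat_load_w at steady state."""
    return heat_load_w / latent_heat_j_kg

# Assume a 10 kW tank and a latent heat of ~88 kJ/kg (88,000 J/kg).
rate = boil_off_rate_kg_s(10_000, 88_000)
print(f"{rate:.3f} kg of fluid vaporized per second")  # 0.114 kg/s
```

The same relation works in reverse: the condenser coils must reject heat at the same rate the fluid absorbs it, which is why the coil capacity, not the fluid volume, sets the tank’s sustained cooling limit.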
The power savings associated with two-phase cooling seem almost too good to be true. Estimates from Mears Advanced Technology Group (MATG) put two-phase power savings in the 60% range, with overall TCO savings of up to 50%. These are staggering numbers.
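As a sanity check on what a 60% power reduction means in dollar terms, here is a small illustrative calculation. The inputs (a 100 kW site running continuously, electricity at $0.12/kWh) are hypothetical assumptions for the sketch, not MATG figures.

```python
# Back-of-the-envelope look at the quoted ~60% power savings.
# All inputs below are hypothetical; they are not MATG data.

def annual_power_cost_usd(load_kw: float, usd_per_kwh: float) -> float:
    """Annual electricity cost for a site drawing load_kw around the clock."""
    return load_kw * 24 * 365 * usd_per_kwh

baseline = annual_power_cost_usd(100, 0.12)   # hypothetical 100 kW air-cooled site
two_phase = baseline * (1 - 0.60)             # apply the quoted 60% savings
print(f"baseline:  ${baseline:,.0f}/yr")      # $105,120/yr
print(f"two-phase: ${two_phase:,.0f}/yr")     # $42,048/yr
print(f"savings:   ${baseline - two_phase:,.0f}/yr")
```

Even at these modest assumed rates, the annual savings at a single site run into the tens of thousands of dollars, which is why the TCO claim, while striking, is at least arithmetically plausible.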