Welcome to Runtime! Today: Why Microsoft is now an enterprise tech chip designer after years of hinting at such plans, AWS goes to K Street, and this week's enterprise moves.
(Was this email forwarded to you? Sign up here to get Runtime each week.)
Silicon Sound
More than five years ago, Microsoft CEO Satya Nadella signaled pretty strongly that a company long known as a software giant was going to start making its own chips: “If I look at the sophistication with which the margin structure of the cloud business is going to change, it’s going to be around how smart are you with silicon,” he said. That silicon has now arrived.
Microsoft designed its own chips in the past for Xboxes and some lower-level data-center tasks, but this is its most substantial effort yet in enterprise tech. It unveiled two new custom processors Wednesday at its Ignite conference: an Arm CPU called Cobalt and an AI accelerator called Maia.
Cobalt was designed for general-purpose workloads, and appears to be Microsoft's answer to Graviton, the Arm server chip AWS first introduced nine months after Nadella made the above remarks.
Microsoft didn't want to get into performance comparisons against Graviton, according to The Verge, but said Cobalt will be “up to 40 percent better than what’s currently in our data centers that use commercial Arm servers.”
Arm server chips like Cobalt generally don't deliver as much top-line horsepower as the latest and greatest x86 chips from Intel and AMD, but they are far more energy efficient and generally cheaper to run.
Maia was designed for both AI training and inference. That's a different approach than the one taken by AWS, which offers cloud customers two AI chips with extremely creative names: Trainium for training workloads and Inferentia for inference workloads.
OpenAI collaborated on the design of Maia with Microsoft and said it plans to use the chip to train future versions of its GPT large language models.
Microsoft also designed a new liquid-cooled rack for servers running Maia, suggesting those chips draw more power than other options on the market.
While Microsoft has been at the forefront of the generative AI boom this year, it's well behind its cloud rivals when it comes to offering alternatives to Big Chip. In addition to AWS's efforts, Google has also been designing its own AI chips for years and Ampere has found a footing inside Oracle's cloud.
It's not clear how many cloud customers opt for the custom silicon offered by their providers compared to the traditional products from Intel, AMD, and Nvidia, although AWS has claimed 50,000 customers ("a low single-digit percentage of AWS's total customer base") are using Graviton.
But it's definitely clear that cloud customers working on AI projects are looking for alternatives to Nvidia, which has the best product on the market and prices it accordingly.
Microsoft did not announce a time frame for making Maia available to customers, which is a little surprising given the pace at which AI is evolving.
Cobalt won't be available until sometime in 2024, and given that it's been two years since AWS announced a new Graviton generation, it wouldn't be a huge shock to see Graviton 4 unveiled in two weeks at re:Invent 2023.
All's fair in the cloud
Believe it or not, Amazon is a prominent backer of the industry groups that are calling shenanigans on Microsoft's licensing practices in Europe and beyond, Bloomberg reported Thursday. It's hoping to dissuade governments from choosing Azure for cloud infrastructure services given how heavily they already use Office (sorry, Microsoft 365).
Microsoft has been on the defensive this year in Europe after agreeing to license Microsoft Teams separately from the rest of its workplace productivity suite, and it has faced additional calls to allow its enterprise software customers to run Microsoft software on other clouds without having to pay a substantial premium. It turns out Amazon is funding several "industry associations" that are making those arguments to regulators, although all those associations insist they're free to make their own decisions.
Sapeon, a spinoff of South Korea's SK Group, launched a new AI chip alongside some bold performance claims that weren't exactly backed up by details, according to Reuters.