Why AI app development tools might be more important than the LLM wars
Lots of companies with money but limited skills are interested in building generative AI apps. They'll need helpful platform tools to get up and running, and competition in this category could set the tone for the enterprise AI era.
Scott Guthrie, executive vice president of Microsoft's Cloud and AI Group, explains its approach to AI app development at Build 2024. (Credit: Dan DeLong for Microsoft)
SEATTLE — While consumer-oriented AI devices and apps still feel like a solution in search of a problem, business customers have different needs and expectations when interacting with their suppliers. Now that companies have had more than a year to kick the tires on enterprise AI app-building services, this could be the year that businesses that preferred to keep those initial experiments in-house gain the confidence to change how they interact with customers.
That's according to Microsoft's John Montgomery, corporate vice president for program management in Azure AI. "I would say a year ago, most of the builds that we were seeing were inward facing. Now we're increasingly seeing ones … where they are very much customer-facing," he said in an interview this week at Microsoft Build.
Lots of companies have been talking about using generative AI technology to summarize meetings or review contracts, scenarios where a mistake is correctable and has no impact on a customer. But Vodafone recently used Azure OpenAI to improve its TOBi virtual assistant, and the new version understands which issues customers need resolved 50% better than its predecessor did.
H&R Block is currently working with Azure OpenAI to build a new service that could "basically take your shoebox of receipts and forms and scan them in and extract the info, or take the PDFs and extract all the relevant information and populate your tax forms," Montgomery said.
However, companies like Vodafone and H&R Block "are the most sophisticated customers," he said. If Microsoft expects to see its AI investments pay off, it will need to get the rest of the corporate world to roll out external-facing AI apps without fear of damaging their relationships with customers.
Early generative AI adopters preferred to work directly with the foundation models and manage everything themselves, Montgomery said, sort of akin to how early cloud infrastructure adopters got started 15 years ago with basic compute and storage services. But there are a lot of companies with plenty of money that lack the skills required to handle that kind of chore effectively, and are willing to pay their infrastructure provider to do more of the heavy lifting.
Now that Azure AI Studio — which does a lot of that heavy lifting — is generally available as of this week at Build, Microsoft expects the companies that are interested in AI but need help building safe and reliable apps to enter the chat. Azure AI Studio also makes it easier for companies to work with several different models as part of their application, Montgomery said, which could make it easier for mere mortals to adopt low-cost or open-source foundation models.
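That multi-model pattern is straightforward to picture: an application routes each task to whichever model fits it best, behind a single interface, sending cheap high-volume work to a small open-source model and harder work to a frontier model. Here is a minimal sketch of the idea in Python; the model names, route table, and costs are illustrative assumptions, not Azure AI Studio's actual API.

```python
# Sketch of the multi-model routing pattern described above.
# All names here ("small-open-model", "frontier-model", the tasks) are
# hypothetical placeholders, not real endpoints or Azure identifiers.

from dataclasses import dataclass


@dataclass
class ModelChoice:
    name: str            # which model handles the task
    cost_per_1k: float   # illustrative relative cost per 1K tokens


# Route cheap, high-volume tasks to a small model; hard, high-stakes
# tasks (like the tax-document extraction described above) to a large one.
ROUTES = {
    "summarize": ModelChoice("small-open-model", 0.0002),
    "classify": ModelChoice("small-open-model", 0.0002),
    "tax-extraction": ModelChoice("frontier-model", 0.01),
}

FALLBACK = ModelChoice("frontier-model", 0.01)


def pick_model(task: str) -> ModelChoice:
    """Return the model configured for a task, defaulting to the frontier model."""
    return ROUTES.get(task, FALLBACK)


print(pick_model("summarize").name)       # small-open-model
print(pick_model("tax-extraction").name)  # frontier-model
```

The design choice being sold here is that the routing table lives in the platform rather than in each customer's code, so swapping in a cheaper or newer model is a configuration change instead of a rewrite.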
Microsoft had "a couple hundred" customers using AI in production in January 2023, when Azure Open AI became generally available, Montgomery said. Now it has 53,000 customers using its AI services across Azure, which is incredible growth but a relatively small portion of its business: for example, Satya Nadella said on its last earnings call that 330,000 Microsoft customers used AI tools across its Power Platform.
As foundation models mature and competition to OpenAI gets real, it seems likely that a lot of enterprise AI customers will choose their AI provider based on the strength of the tools they can use to create AI applications. Microsoft has a strong position with developers thanks to widely used tools like GitHub and Visual Studio Code, and will need to build, maintain, and improve similar tools for AI developers to keep its advantage.
(This post originally appeared in the Runtime newsletter on May 23rd; sign up here to get more enterprise tech news three times a week.)
Tom Krazit has covered the technology industry for over 20 years, focused on enterprise technology during the rise of cloud computing over the last ten years at Gigaom, Structure and Protocol.