Why AI app development tools might be more important than the LLM wars
Lots of companies with money but limited skills are interested in generative AI apps. They'll need helpful platform tools to get up and running, and competition in this category could set the tone for the enterprise AI era.
Scott Guthrie, executive vice president of Microsoft's Cloud and AI Group, explains its approach to AI app development at Build 2024. (Credit: Dan DeLong for Microsoft)
SEATTLE — While consumer-oriented AI devices and apps still feel like a solution looking for a problem, business customers have different needs and expectations when interacting with their suppliers. Now that companies have had over a year to kick the tires on enterprise AI app-building services, this could be the year that businesses that preferred to keep those initial experiments in-house gain the confidence to change how they interact with customers.
That's according to Microsoft's John Montgomery, corporate vice president for program management in Azure AI. "I would say a year ago, most of the builds that we were seeing were inward facing. Now we're increasingly seeing ones … where they are very much customer-facing," he said in an interview this week at Microsoft Build.
Lots of companies have been talking about using generative AI technology to summarize meetings or review contracts, scenarios where if something goes wrong it's correctable and there's no impact on a customer. But Vodafone recently used Azure OpenAI to improve its TOBi virtual assistant, which now understands the issues customers need resolved 50% better than the earlier version.
H&R Block is currently working with Azure OpenAI to build a new service that could "basically take your shoebox of receipts and forms and scan them in and extract the info, or take the PDFs and extract all the relevant information and populate your tax forms," Montgomery said.
However, companies like Vodafone and H&R Block "are the most sophisticated customers," he said. If Microsoft expects to see its AI investments pay off, it will need to get the rest of the corporate world to roll out external-facing AI apps without fear of damaging their relationships with customers.
Early generative AI adopters preferred to work directly with the foundation models and manage everything themselves, Montgomery said, sort of akin to how early cloud infrastructure adopters got started 15 years ago with basic compute and storage services. But there are a lot of companies with plenty of money that lack the skills required to handle that kind of chore effectively, and are willing to pay their infrastructure provider to do more of the heavy lifting.
Now that Azure AI Studio — which does a lot of that heavy lifting — is generally available as of this week at Build, Microsoft expects the companies that are interested in AI but need help building safe and reliable apps to enter the chat. Azure AI Studio also makes it easier for companies to work with several different models as part of their application, Montgomery said, which could make it easier for mere mortals to adopt low-cost or open-source foundation models.
Microsoft had "a couple hundred" customers using AI in production in January 2023, when Azure OpenAI became generally available, Montgomery said. Now it has 53,000 customers using its AI services across Azure, which is incredible growth but a relatively small portion of its business: for example, Satya Nadella said on its last earnings call that 330,000 Microsoft customers used AI tools across its Power Platform.
As foundation models mature and competition to OpenAI gets real, it seems likely that a lot of enterprise AI customers will choose their AI provider based on the strength of the tools they can use to create AI applications. Microsoft has a strong position with developers thanks to widely used tools like GitHub and Visual Studio Code, and will need to build, maintain, and improve similar tools for AI developers to keep its advantage.
(This post originally appeared in the Runtime newsletter on May 23rd; sign up here to get more enterprise tech news three times a week.)
Tom Krazit has covered the technology industry for over 20 years, focused on enterprise technology during the rise of cloud computing over the last ten years at Gigaom, Structure and Protocol.