Why AI app development tools might be more important than the LLM wars
Lots of companies with money but limited skills are interested in building generative AI apps. They'll need helpful platform tools to get up and running, and competition in this category could set the tone for the enterprise AI era.
SEATTLE — While consumer-oriented AI devices and apps still feel like a solution looking for a problem, business customers have different needs and expectations when interacting with their suppliers. Now that companies have had over a year to kick the tires on enterprise AI app-building services, this could be the year that businesses that preferred to keep those initial experiments in-house gain the confidence to change how they interact with customers.
That's according to Microsoft's John Montgomery, corporate vice president for program management in Azure AI. "I would say a year ago, most of the builds that we were seeing were inward facing. Now we're increasingly seeing ones … where they are very much customer-facing," he said in an interview this week at Microsoft Build.
Lots of companies have been talking about using generative AI technology to summarize meetings or review contracts, scenarios where mistakes are correctable and don't affect customers. But Vodafone recently used Azure OpenAI to improve its TOBi virtual assistant, which can now quickly identify the issues customers need resolved 50% better than the earlier version.
H&R Block is currently working with Azure OpenAI to build a new service that could "basically take your shoebox of receipts and forms and scan them in and extract the info, or take the PDFs and extract all the relevant information and populate your tax forms," Montgomery said.
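As a rough illustration of that kind of extraction workflow, here's a minimal sketch, not H&R Block's actual implementation, that asks an Azure OpenAI chat model to pull structured fields out of the text of a scanned receipt. The deployment name, environment variables, and field list are assumptions made for the example.

```python
# Illustrative sketch only: extract structured fields from receipt text
# using the Azure OpenAI chat completions API. Deployment name, endpoint
# settings, and the field list are hypothetical.
import json
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
)

def extract_receipt_fields(receipt_text: str) -> dict:
    """Ask the model to return vendor, date, total, and category as JSON."""
    response = client.chat.completions.create(
        model="gpt-4o",  # hypothetical deployment name
        response_format={"type": "json_object"},
        messages=[
            {
                "role": "system",
                "content": "Extract vendor, date, total, and expense category "
                           "from the receipt text. Reply with a JSON object.",
            },
            {"role": "user", "content": receipt_text},
        ],
    )
    return json.loads(response.choices[0].message.content)
```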
However, companies like Vodafone and H&R Block "are the most sophisticated customers," he said. If Microsoft expects to see its AI investments pay off, it will need to get the rest of the corporate world to roll out external-facing AI apps without fear of damaging their relationships with customers.
Early generative AI adopters preferred to work directly with the foundation models and manage everything themselves, Montgomery said, sort of akin to how early cloud infrastructure adopters got started 15 years ago with basic compute and storage services. But there are a lot of companies with plenty of money that lack the skills required to handle that kind of chore effectively, and are willing to pay their infrastructure provider to do more of the heavy lifting.
Now that Azure AI Studio — which does a lot of that heavy lifting — is generally available as of this week at Build, Microsoft expects the companies that are interested in AI but need help building safe and reliable apps to enter the chat. Azure AI Studio also makes it easier for companies to work with several different models as part of their application, Montgomery said, which could make it easier for mere mortals to adopt low-cost or open-source foundation models.
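To make that multi-model idea concrete, here's a hedged sketch of the kind of routing an application might do between a cheaper model and a frontier model sitting behind the same endpoint. The deployment names and the routing rule are hypothetical, and Azure AI Studio surfaces this choice through its own tooling rather than hand-rolled code like this.

```python
# Illustrative sketch only: send simple requests to a cheaper deployment and
# harder ones to a premium deployment. Deployment names and the routing
# heuristic are hypothetical.
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
)

CHEAP_DEPLOYMENT = "small-model"    # hypothetical low-cost/open model deployment
PREMIUM_DEPLOYMENT = "gpt-4o"       # hypothetical frontier-model deployment

def answer(question: str) -> str:
    # Naive routing rule purely for illustration: short questions go to the
    # cheaper model, longer ones to the premium model.
    deployment = CHEAP_DEPLOYMENT if len(question) < 200 else PREMIUM_DEPLOYMENT
    response = client.chat.completions.create(
        model=deployment,
        messages=[{"role": "user", "content": question}],
    )
    return response.choices[0].message.content
```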
Microsoft had "a couple hundred" customers using AI in production in January 2023, when Azure Open AI became generally available, Montgomery said. Now it has 53,000 customers using its AI services across Azure, which is incredible growth but a relatively small portion of its business: for example, Satya Nadella said on its last earnings call that 330,000 Microsoft customers used AI tools across its Power Platform.
As foundation models mature and competition to OpenAI gets real, it seems likely that a lot of enterprise AI customers will choose their AI provider based on the strength of the tools they can use to create AI applications. Microsoft has a strong position with developers thanks to widely used tools like GitHub and Visual Studio Code, and will need to build, maintain, and improve similar tools for AI developers to keep its advantage.
(This post originally appeared in the Runtime newsletter on May 23rd; sign up here to get more enterprise tech news three times a week.)
Tom Krazit has covered the technology industry for over 20 years, focused on enterprise technology during the rise of cloud computing over the last ten years at Gigaom, Structure and Protocol.