Why GitHub Copilot needed a few new flight engineers from Anthropic and Google
GitHub Copilot users will be able to swap in AI models from Anthropic and Google in place of the default models from OpenAI. When the flagship product of the generative AI era takes such a step, it's a sign OpenAI's leadership position is waning.
GitHub Copilot is the shining star of the generative AI boom, perhaps the most widely used enterprise AI tool in the world. Microsoft and its GitHub subsidiary spent the last several years promoting their special relationship with OpenAI and its large language models as the secret sauce behind the coding assistant, but times have changed.
GitHub Copilot users will be able to swap in AI models from Anthropic and Google to generate answers when using Copilot Chat to ask questions, GitHub CEO Thomas Dohmke announced Tuesday at GitHub Universe. "It is clear the next phase of AI code generation will not only be defined by multimodel functionality, but by multimodel choice," he said in a blog post.
Anthropic's Claude 3.5 Sonnet — which has gained a lot of traction as a coding assistant this year — is available today, while Copilot users will be able to select Google's Gemini 1.5 Pro in "the coming weeks," according to GitHub. Developers will be able to continue using several models from OpenAI, including GPT-4o, o1-preview, and o1-mini.
“We truly believe that the era of a single model is over,” Dohmke told TechCrunch, citing the tradeoffs that Copilot customers need to make between latency and accuracy.
It's not clear from Tuesday's presentation whether other models, such as Meta's Llama, will eventually make their way into Copilot, but now that GitHub has built the ability to switch models into the tool, it's not hard to imagine it offering several other choices at a later date.
During the early days of their partnership, Microsoft and GitHub sang the praises of OpenAI's technology and actively encouraged the rest of enterprise tech to think their exclusive access to OpenAI put them way ahead of competitors. But as the pace of OpenAI's model breakthroughs has slowed, rivals like Anthropic, Google, Meta, and others have quickly managed to erase much of that advantage.
Simply providing exclusive access to OpenAI's models was enough to jumpstart Microsoft's cloud AI business, but the real competition in enterprise tech has shifted to building the tools and platforms that customers need to construct their own AI apps atop those models. Over the last year, Microsoft has also made several other models available through its Azure AI service, a strategy that more closely resembles AWS's approach with Bedrock.
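The multi-model platform strategy described above can be illustrated with a short sketch. This is not GitHub's, Azure's, or Bedrock's actual API — all of the names below (`ModelInfo`, `CATALOG`, `route`) are hypothetical — but it shows the basic "model catalog" pattern those platforms follow, where one client-facing interface routes each request to a model from whichever provider the customer selects:

```python
# Hypothetical sketch of a multi-model catalog: one user-facing
# interface, multiple providers behind it. Names are illustrative only.
from dataclasses import dataclass


@dataclass(frozen=True)
class ModelInfo:
    provider: str   # e.g. "openai", "anthropic", "google"
    model_id: str   # the provider-specific model identifier


# Catalog mapping user-facing model names to provider back ends,
# loosely mirroring the choices GitHub announced for Copilot Chat.
CATALOG = {
    "gpt-4o": ModelInfo("openai", "gpt-4o"),
    "claude-3.5-sonnet": ModelInfo("anthropic", "claude-3-5-sonnet"),
    "gemini-1.5-pro": ModelInfo("google", "gemini-1.5-pro"),
}


def route(model_name: str) -> ModelInfo:
    """Resolve a user-selected model name to its provider back end."""
    try:
        return CATALOG[model_name]
    except KeyError:
        raise ValueError(f"unknown model: {model_name}") from None


print(route("claude-3.5-sonnet").provider)  # anthropic
```

The design point is that adding a new provider only means adding a catalog entry, which is why, once a vendor builds the switching layer, offering more models later is cheap.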
GitHub has always had an independent streak within the Microsoft universe, as COO Kyle Daigle told Runtime last year at AWS re:Invent. But Tuesday's announcement makes it clear that if the company behind one of the most popular generative AI tools on the planet thinks it can no longer afford to rely entirely on OpenAI, nobody can.
"We, at GitHub, believe in developer choice and that developers — for reasons of company policy, benchmarks that they have seen, different programming languages and of course, personal preference, or because they’re using that model for other scenarios already — prefer one of the competing models, and so we’re officially partnering with both Anthropic and Google," Dohmke told TechCrunch.
(This post originally appeared in the Runtime newsletter on Oct. 29; sign up here to get more enterprise tech news three times a week.)
Tom Krazit has covered the technology industry for over 20 years, focusing on enterprise technology during the rise of cloud computing over the last ten years at Gigaom, Structure, and Protocol.