Why GitHub Copilot needed a few new flight engineers from Anthropic and Google
GitHub Copilot users will be able to swap in AI models from Anthropic and Google in place of the default models from OpenAI. When the flagship product of the generative AI era takes such a step, it's a sign OpenAI's leadership position is waning.
GitHub Copilot is the shining star of the generative AI boom, perhaps the most widely used generative AI tool among businesses around the world. Microsoft and its GitHub subsidiary spent the last several years promoting their special relationship with OpenAI and its large language models as the secret sauce behind the coding assistant, but times have changed.
GitHub Copilot users will be able to swap in AI models from Anthropic and Google to generate answers when using Copilot Chat to ask questions, GitHub CEO Thomas Dohmke announced Tuesday at GitHub Universe. "It is clear the next phase of AI code generation will not only be defined by multimodel functionality, but by multimodel choice," he said in a blog post.
Anthropic's Claude 3.5 Sonnet — which has gained a lot of traction as a coding assistant this year — is available today, while Copilot users will be able to select Google's Gemini 1.5 Pro in "the coming weeks," according to GitHub. Developers will be able to continue using several models from OpenAI, including GPT-4o, o1-preview, and o1-mini.
“We truly believe that the era of a single model is over,” Dohmke told TechCrunch, citing the tradeoffs that Copilot customers need to make between latency and accuracy.
It's not clear from Tuesday's presentation whether other models, such as Meta's Llama, will eventually make their way into Copilot, but now that GitHub has built the ability to switch models into the tool, it's not hard to imagine it offering several other choices at a later date.
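The model-choice pattern GitHub has built, one chat front end routing requests to interchangeable backends from different vendors, can be sketched generically. This is a purely illustrative Python sketch, not GitHub's actual implementation; every class, method, and model name here is hypothetical:

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical illustration of the "model choice" pattern: a single
# chat router that registers interchangeable model backends and lets
# the user swap the active one. None of these names reflect GitHub's
# or any vendor's real API.

@dataclass
class ModelBackend:
    name: str
    vendor: str
    complete: Callable[[str], str]  # prompt -> completion

class ChatRouter:
    def __init__(self) -> None:
        self._backends: dict[str, ModelBackend] = {}
        self._active: str | None = None

    def register(self, backend: ModelBackend) -> None:
        self._backends[backend.name] = backend
        if self._active is None:
            # The first registered backend acts as the default model.
            self._active = backend.name

    def switch(self, name: str) -> None:
        # Swapping models is just repointing the router; the chat
        # interface on top doesn't change.
        if name not in self._backends:
            raise KeyError(f"unknown model: {name}")
        self._active = name

    def ask(self, prompt: str) -> str:
        assert self._active is not None, "no backend registered"
        return self._backends[self._active].complete(prompt)

# Usage: register backends from several vendors (stubbed here with
# lambdas), then swap the active model at will.
router = ChatRouter()
router.register(ModelBackend("gpt-4o", "OpenAI", lambda p: f"[gpt-4o] {p}"))
router.register(ModelBackend("claude-3.5-sonnet", "Anthropic", lambda p: f"[claude] {p}"))
router.switch("claude-3.5-sonnet")
print(router.ask("Explain recursion"))
```

The design point is that the switching logic lives entirely in the router; adding a new vendor later, as the article speculates GitHub might, only requires registering another backend.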
During the early days of their partnership, Microsoft and GitHub sang the praises of OpenAI's technology and actively encouraged the rest of enterprise tech to think their exclusive access to OpenAI put them way ahead of competitors. But as the pace of OpenAI's model breakthroughs has slowed, rivals like Anthropic, Google, Meta, and others have quickly managed to erase much of that advantage.
Simply providing exclusive access to OpenAI's models was enough to jumpstart Microsoft's cloud AI business, but the real enterprise tech competition has shifted to the vendors that build the best tools and platforms that their customers need to build their own AI apps atop those models. Over the last year, Microsoft has also made several other models available through its Azure AI service in a strategy that more closely resembles AWS's approach with Bedrock.
GitHub has always had an independent streak within the Microsoft universe, as COO Kyle Daigle told Runtime last year at AWS re:Invent. But Tuesday's announcement makes it clear that if the company behind one of the most popular generative AI tools on the planet thinks it can no longer afford to rely entirely on OpenAI, nobody can.
"We, at GitHub, believe in developer choice and that developers — for reasons of company policy, benchmarks that they have seen, different programming languages and of course, personal preference, or because they’re using that model for other scenarios already — prefer one of the competing models, and so we’re officially partnering with both Anthropic and Google," Dohmke told TechCrunch.
(This post originally appeared in the Runtime newsletter on Oct. 29th; sign up here to get more enterprise tech news three times a week.)
Tom Krazit has covered the technology industry for over 20 years, focusing on enterprise technology during the rise of cloud computing over the last ten years at Gigaom, Structure, and Protocol.