Why GitHub Copilot needed a few new flight engineers from Anthropic and Google
GitHub Copilot users will be able to swap in AI models from Anthropic and Google in place of the default models from OpenAI. When the flagship product of the generative AI era takes such a step, it's a sign OpenAI's leadership position is waning.
GitHub Copilot is the shining star of the generative AI boom, perhaps the most widely used enterprise AI tool in the world. Microsoft and its GitHub subsidiary spent the last several years promoting their special relationship with OpenAI and its large language models as the secret sauce behind the coding assistant, but times have changed.
GitHub Copilot users will be able to swap in AI models from Anthropic and Google to generate answers when using Copilot Chat to ask questions, GitHub CEO Thomas Dohmke announced Tuesday at GitHub Universe. "It is clear the next phase of AI code generation will not only be defined by multimodel functionality, but by multimodel choice," he said in a blog post.
Anthropic's Claude 3.5 Sonnet — which has gained a lot of traction as a coding assistant this year — is available today, while Copilot users will be able to select Google's Gemini 1.5 Pro in "the coming weeks," according to GitHub. Developers will be able to continue using several models from OpenAI, including GPT-4o, o1-preview, and o1-mini.
“We truly believe that the era of a single model is over,” Dohmke told TechCrunch, citing the tradeoffs that Copilot customers need to make between latency and accuracy.
It's not clear from Tuesday's presentation whether other models, such as Meta's Llama, will eventually make their way into Copilot, but now that GitHub has built model switching into the tool, it's not hard to imagine it offering several other choices down the road.
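GitHub hasn't shared the internals of Copilot's model picker, but the general pattern behind this kind of multi-model choice is familiar: a thin routing layer sits between the chat interface and a set of interchangeable provider adapters. Here's a minimal TypeScript sketch of that pattern; every class, method, and model ID below is hypothetical and for illustration only, not taken from Copilot's code.

```typescript
// Hypothetical sketch of a provider-agnostic model router, the kind of
// abstraction that lets a chat tool swap OpenAI, Anthropic, or Google
// models per request. Provider calls are stubbed rather than real SDKs.

interface ChatModel {
  id: string;                                // e.g. a model identifier shown in a dropdown
  complete(prompt: string): Promise<string>; // the provider-specific API call hides here
}

// Each adapter would wrap its own provider SDK; stubbed for illustration.
class OpenAIModel implements ChatModel {
  constructor(public id: string) {}
  async complete(prompt: string): Promise<string> {
    return `[${this.id}] response to: ${prompt}`; // placeholder for a real API call
  }
}

class AnthropicModel implements ChatModel {
  constructor(public id: string) {}
  async complete(prompt: string): Promise<string> {
    return `[${this.id}] response to: ${prompt}`; // placeholder for a real API call
  }
}

// The router turns "which model?" into a per-request choice.
class ModelRouter {
  private models = new Map<string, ChatModel>();

  register(model: ChatModel): void {
    this.models.set(model.id, model);
  }

  async chat(modelId: string, prompt: string): Promise<string> {
    const model = this.models.get(modelId);
    if (!model) throw new Error(`Unknown model: ${modelId}`);
    return model.complete(prompt);
  }
}

// Usage: the user's dropdown selection simply becomes the modelId argument.
const router = new ModelRouter();
router.register(new OpenAIModel("gpt-4o"));
router.register(new AnthropicModel("claude-3-5-sonnet"));

router.chat("claude-3-5-sonnet", "Explain this regex: ^\\d{4}-\\d{2}$")
  .then(console.log);
```

Once that layer exists, adding another provider is a matter of registering one more adapter, which is why it's easy to imagine the menu growing beyond the three vendors announced this week.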
During the early days of their partnership, Microsoft and GitHub sang the praises of OpenAI's technology and actively encouraged the rest of enterprise tech to think their exclusive access to OpenAI put them way ahead of competitors. But as the pace of OpenAI's model breakthroughs has slowed, rivals like Anthropic, Google, Meta, and others have quickly managed to erase much of that advantage.
Simply providing exclusive access to OpenAI's models was enough to jumpstart Microsoft's cloud AI business, but the real enterprise tech competition has shifted to building the tools and platforms customers need to create their own AI apps atop those models. Over the last year Microsoft has also made several other models available through its Azure AI service, a strategy that more closely resembles AWS's approach with Bedrock.
GitHub has always had an independent streak within the Microsoft universe, as COO Kyle Daigle told Runtime last year at AWS re:Invent. But Tuesday's announcement makes it clear that if the company behind one of the most popular generative AI tools on the planet thinks it can no longer afford to rely entirely on OpenAI, nobody can.
"We, at GitHub, believe in developer choice and that developers — for reasons of company policy, benchmarks that they have seen, different programming languages and of course, personal preference, or because they’re using that model for other scenarios already — prefer one of the competing models, and so we’re officially partnering with both Anthropic and Google," Dohmke told TechCrunch.
(This post originally appeared in the Runtime newsletter on Oct. 29; sign up here to get more enterprise tech news three times a week.)
Tom Krazit has covered the technology industry for over 20 years, focusing on enterprise technology during the rise of cloud computing over the last ten years at Gigaom, Structure, and Protocol.