OpenAI cuts out the middleman; HPE gets Cray Cray

A data-center worker in a hard hat installs a blade server into a server rack.
(Credit: HPE)

Welcome to Runtime! Today: OpenAI would rather ChatGPT users spend more time using its tool than other "copilots," HPE rolls out a new supercomputer design, and the quote of the week.

(Was this email forwarded to you? Sign up here to get Runtime each week.)


Ship it

This garden has walls: One could argue the entire generative AI revolution consists mostly of companies trying to come up with a unique "wrapper" for ChatGPT. OpenAI appears to have come to that conclusion as well, and this week it introduced new features for its desktop apps to entice users to stick around.

Paid ChatGPT users on macOS can now generate code and launch popular software-development tools like VS Code and Apple's Xcode directly from the ChatGPT desktop app, with the newly generated code carried over. "Alexander Embiricos, product lead with the ChatGPT desktop team, said one of the biggest user behaviors the company saw was copy-pasting text or code generation with ChatGPT to other applications," VentureBeat reported.

Engine engine number 9: It's been a decade since Microsoft released the open-source version of its .NET software-development framework, a move that accelerated the company's journey from "open source is a cancer" to genuinely accepted sponsor of the open-source movement. While developers have plenty of frameworks to choose from ten years later, .NET remains widely used, and this week Microsoft released .NET 9.

"The Server GC has been significantly changed to be adaptive to application memory requirements as opposed to the resources (memory and CPU) available in the environment (machine/VM/container)," Microsoft said, which means it will work better in "high core-count environments." Believe it or not, the new release also makes it easier for developers to add AI into their applications.

Supercomputers for super problems: HPE released the latest update to the legendary Cray lineup of supercomputers this week, giving buyers of high-performance computers something new to look at. The Cray Supercomputing EX4252 model is a one-rack unit blade server with eight AMD Epyc processors, and HPE said it was designed for "research institutions entrusted with solving the world’s biggest problems," which usually means places like the U.S. Department of Energy.

A different model targeted at AI customers will come with four of Nvidia's Blackwell GPUs, which will be trickling out to the end-user community over the next few months. The new systems are priced at "if you have to ask, you can't afford it."

Steering the ship: Dozens of companies announced new products, services, and enhancements around Kubernetes this week during KubeCon, and Pulumi was no exception. The infrastructure-as-code provider released a new version of its Pulumi Kubernetes Operator, which allows users to manage resources running directly in Kubernetes alongside other cloud resources.
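For anyone who hasn't used Pulumi, a minimal sketch of the model the operator builds on may help: a Kubernetes Deployment and an AWS S3 bucket declared side by side in one TypeScript program. The resource names and container image below are illustrative placeholders, and this uses the standard Pulumi SDKs rather than the new operator itself.

    import * as aws from "@pulumi/aws";
    import * as k8s from "@pulumi/kubernetes";

    // A cloud resource that lives outside the cluster.
    const assetsBucket = new aws.s3.Bucket("app-assets");

    // A Kubernetes resource, declared in the same program and tracked in the same state.
    const appLabels = { app: "web" };
    const deployment = new k8s.apps.v1.Deployment("web", {
        spec: {
            replicas: 2,
            selector: { matchLabels: appLabels },
            template: {
                metadata: { labels: appLabels },
                spec: { containers: [{ name: "web", image: "nginx:1.27" }] },
            },
        },
    });

    // Outputs that other tooling can consume once the program has been deployed.
    export const bucketName = assetsBucket.id;
    export const deploymentName = deployment.metadata.name;

Running "pulumi up" against a program like that reconciles both resources from a single state, which is roughly the workflow the operator moves inside a Kubernetes cluster.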

It also improved the way that users can manage secrets across Kubernetes clusters, automating a process that previously required manual intervention. Pulumi is in an interesting position for companies that want to manage their infrastructure using code, as IBM waits to acquire HashiCorp and the OpenTofu fork fractures that community.

Platform power: Akamai was another company angling for a piece of the Kubernetes crowd this week, unveiling a new application development platform. The Akamai App Platform promises to let developers automatically launch their applications on Kubernetes without having to go through a lengthy configuration process.

Akamai is also in an interesting position as it slowly builds up services for cloud infrastructure and now platform development, challenging the Big Three (and Oracle, I guess) for cloud business. And given the slowdown in its core content-delivery market, this side of the business could become a much bigger part of Akamai's ambitions over the next several years.


Stat of the week

"Executives are all in on AI, with 99% planning AI investment in the coming year," according to a new survey conducted by Slack. That's certainly not surprising, but "excitement around AI is … cooling, dropping six percentage points (47% to 41%) among the overall global population" of everyday workers who participated in the survey, suggesting that inside a lot of companies, generative AI is still a solution looking for a problem.


Quote of the week

"We’re a regulatory compliance organization. In other words, we are the roadies and the janitors to the rock star engineers who create this code. Part of being a roadie is dealing with patent trolls." — Linux Foundation executive director Jim Zemlin, discussing its new strategy for dealing with patent trolls and redefining the concept of a roadie in one fell swoop.


The Runtime roundup

Salesforce had a hell of a Friday: it suffered a widespread, hours-long outage across some of its core services, and Clara Shih, the face of its AI efforts, has apparently left the company, according to a report from Patrick Walravens of JMP Securities that was confirmed by CIO.

Microsoft gave U.S. government agencies a free year of its G5 package, betting that they would find it too difficult to switch providers and would keep paying for that more expensive tier of services on an ongoing basis, according to a detailed report by ProPublica.


Thanks for reading — see you Tuesday!
