Today: Liberty Mutual CIO Monica Caldas explains how the insurance company quickly rolled out an internal generative AI app, former OpenAI CTO Mira Murati surfaces with a new company, and the latest funding rounds in enterprise tech.
How Liberty Mutual was able to jump into generative AI thanks to a clear data strategy and FinOps
LibertyGPT is an internal application currently used by more than 10,000 Liberty Mutual employees to summarize information and answer common questions. An early version was built in just two weeks thanks to previously established data pipelines and cost controls.
Long before generative AI hit the scene, insurance companies were using AI to model risk and detect fraud. But while many businesses continue to struggle to implement generative AI apps in production, Liberty Mutual's 2023 decision to build an internal version of ChatGPT for its employees was made easier thanks to the data foundation and spending-management tools it had already built.
"We've had this scaffolding of, how do you build models? How do you actually test the models? What is the data pipeline for that? How do you use internal sources of data to improve your model accuracy?" said Monica Caldas, CIO of Liberty Mutual, in a recent interview. "All these questions of governance have been things that we've been thoughtful about for a long time."
LibertyGPT is currently used by more than 10,000 Liberty Mutual employees to summarize information and answer common questions. An early version was built in just two weeks after Microsoft made OpenAI's LLMs available to enterprises, and Liberty Mutual has since expanded the app to include several different AI models and handle the unique needs of different departments across the company.
Everybody wants to use the Ferrari, but if you're just going down the street, you might just need a bike.
There were several factors beyond prior AI expertise that allowed Liberty Mutual to launch its generative AI app quickly and refine it in production. When ChatGPT dropped in late 2022, the company had already migrated most of its workloads to the cloud and implemented FinOps, the cloud cost-management discipline that is taking on new importance in the generative AI era given the current cost of inference.
"Everybody wants to use the Ferrari, but if you're just going down the street, you might just need a bike," Caldas said. The company built a "FinOps for AI" team across the technology and finance organizations that creates models based on proposed additions to LibertyGPT and tries to evaluate how much the app will cost to operate as its mandate expands.
Right now, LibertyGPT is serving just 10 use cases, and it's easy for the company to track its spending, but Caldas expects that to change as the app scales across the entire 40,000-person company. And she thinks that process will become more complex as generative AI vendors experiment with different pricing strategies, such as moving from traditional subscription-based pricing to consumption-based pricing.
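To make that pricing shift concrete, here is a minimal back-of-the-envelope sketch of the kind of estimate a "FinOps for AI" team might run for a single use case under consumption-based, per-token pricing. Every number and name below is an illustrative assumption, not a Liberty Mutual or vendor figure.

# Hypothetical sketch of a per-use-case cost model under consumption-based pricing.
# All prices and volumes are made-up placeholders, not Liberty Mutual figures.

def monthly_token_cost(
    requests_per_month: int,
    avg_input_tokens: int,
    avg_output_tokens: int,
    input_price_per_1m: float,   # dollars per 1M input tokens (assumed rate)
    output_price_per_1m: float,  # dollars per 1M output tokens (assumed rate)
) -> float:
    """Estimate monthly spend for one use case when billed per token."""
    input_cost = requests_per_month * avg_input_tokens * input_price_per_1m / 1_000_000
    output_cost = requests_per_month * avg_output_tokens * output_price_per_1m / 1_000_000
    return input_cost + output_cost

# Example: an internal summarization use case with illustrative numbers only.
print(monthly_token_cost(
    requests_per_month=50_000,
    avg_input_tokens=2_000,
    avg_output_tokens=300,
    input_price_per_1m=2.50,
    output_price_per_1m=10.00,
))  # -> 250.0 of input cost + 150.0 of output cost = 400.0 per month

The point of a sketch like this is that spend now moves with usage: double the requests or the prompt length and the bill doubles, which is why tracking gets harder as an app scales from 10 use cases to a whole company.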
Liberty Mutual also set up an AI committee in 2023 with representatives from tech, legal, and HR to help it "balance the defensive and offensive side" of building around generative AI, Caldas said. The company has reams of sensitive data about its customers that needs to be protected, but "we're also not going to be afraid and lock everything down and not allow people to experiment and innovate and think about how to solve problems in a new way," she said.
After the initial launch, by the middle of last year Liberty Mutual was ready to start refining how the company uses LibertyGPT. It built a "model hub" to move beyond OpenAI's models and allow users to pick from several different options, developed one summarization workflow so employees wouldn't "recreate summarization engines all over the company," and centralized its source code repository to allow engineers to move faster, Caldas said.
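For illustration only, a "model hub" along those lines can be as simple as a lookup from an approved use case to a configured model, so teams aren't hard-coding a single provider everywhere. The use cases and model names below are hypothetical placeholders, not LibertyGPT's actual configuration.

# Hypothetical illustration of a model-hub pattern: route each approved use case
# to a configured model instead of wiring one provider into every team's code.
MODEL_HUB = {
    "claims_summarization": "gpt-4o",         # assumed default for summarization
    "policy_qa": "claude-3-5-sonnet",         # illustrative alternative model
    "internal_search": "small-open-model",    # cheaper option for lighter tasks
}

def pick_model(use_case: str) -> str:
    """Return the configured model for a use case, falling back to a default."""
    return MODEL_HUB.get(use_case, "gpt-4o")

print(pick_model("policy_qa"))  # -> claude-3-5-sonnet

A central mapping like this is also what makes a shared summarization workflow possible: swapping or adding a model changes one configuration entry rather than every team's application.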
We are not taking away from our journey to the cloud in order to go fund AI.
But she emphasized that the company's fiscal discipline around technology spending is what really allowed it to quickly embrace generative AI tools.
After years of throwing money at technology projects, big businesses around the world tightened the reins amid inflation concerns and rising interest rates over the last two years, and Liberty Mutual was no exception. But the pressure to embrace generative AI tools has created a problem for CIOs who still need to keep their entire pre-existing digital operation up and running with a flat or slowly growing budget. Microsoft's second-quarter earnings results showed that some of them are spending less on non-AI technology in order to experiment with generative AI.
"We have taken our budget down about 4%. We created capacity" for projects like LibertyGPT by reducing the company's application footprint by 20%, Caldas said. "We are not taking away from our journey to the cloud in order to go fund AI."
Tom Krazit has covered the technology industry for over 20 years, focusing on enterprise technology during the rise of cloud computing over the last ten years at Gigaom, Structure, and Protocol.
It was only in the last six months that Canva decided generative AI coding assistants were good enough for its employees. It got there through a period of trial and error that suggests GenAI vendors need more flexible pricing strategies.
Figma's collaboration tools are a hit with designers thanks to its decision to take a page from the gaming software playbook and rebuild its databases for "infinite scale."
Rajesh Naidu works for a company that, like many, has grown through acquisitions over the years. His job involves integrating those acquisitions onto a common tech stack, which requires taking a hard look at the SaaS applications used by those companies.