Snowflake CEO Sridhar Ramaswamy: We are going to move faster

Snowflake CEO Sridhar Ramaswamy speaks on stage at the 2024 Snowflake Data Cloud Summit. (Credit: Snowflake)

Following a decade-long meteoric rise as a modern data warehouse darling, Snowflake went through a period of profound change over the last year. The generative AI boom revealed holes in its product strategy, the stock market soured on what was the best software IPO in history when it debuted in 2020, and Sridhar Ramaswamy joined the company.

Ramaswamy, who spent more than 15 years at Google, came to Snowflake through the acquisition of his AI search startup Neeva in May 2023 and immediately updated his LinkedIn profile with his new job description: "Learning Snowflake." Eight months later he was CEO, replacing enterprise software veteran Frank Slootman in what the company made clear was a pivot toward a "technologist" with AI expertise.

Two weeks ago at Snowflake Data Cloud Summit 2024, Ramaswamy officially turned the company into a storage-neutral vendor by extending full support for Apache Iceberg, an open table format that lets customers keep their data wherever they like rather than paying Snowflake for storage. Snowflake will lose some storage revenue in order to make this move, but it's clearly where the market is going, and a strategy that its archrival Databricks has pursued for years.

In an interview at the Data Cloud Summit, Ramaswamy described how enterprise customers are working with generative AI, outlined growth opportunities for Snowflake's future, and lamented the "insular" culture at Google that denied it the opportunity to lead the generative AI transition.

This interview has been edited and condensed for clarity.

In your address on Monday, you said that the bar is higher for enterprise AI, which certainly rings true in my experience when talking to your customers. Why is that? And if the bar is higher for them, why are you and basically every other enterprise tech vendor urging them to get on board with this sooner rather than later?

I think it reflects the fact that pretty much every senior executive, every CEO, every board, and, most of all, everybody understands that there's something cool about it. It has the feel of the early internet, the early smartphone revolution. But I think what's different this time around is also that this is now, what, the fourth or fifth platform shift in tech; I think companies are also a lot more wary of getting disrupted by AI. And so I see this as the collective intelligence of people understanding that these things can drive potentially tectonic changes.

By the way, I've talked about how we have 700 use cases making their way to production at Snowflake, and quite a lot of them are customer initiated. A bunch are initiated by us having conversations, but the majority are customer initiated. This speaks to the excitement and the competitive pressure that they are feeling, and our value add is in demystifying this stuff.

Did you consider buying Tabular?

We considered a number of options. At the end of the day, we decided to build ourselves the functionality offered by the company that was bought. That's the Polaris Catalog.

Remember, Iceberg is an open source format. It has contributors from Apple. It has contributors from Netflix. It's a broad-based community of open-source developers. And the CSPs, whether it is AWS, [Microsoft] Azure or GCP, all three of them are supporting this format. I think there is a guarantee that the format itself will stay open. That's a positive thing.

The second part is the cloud catalog, and we decided to go with the build-it-from-scratch approach, which is what Polaris was. It's going to be in public preview within the next few weeks. There are customers that are actively testing it out, so we'll have more to say about that in a few weeks. But it's really this combination of support for the open format [and] support for an open catalog that we think is a big step forward for just the whole industry in terms of data interoperability.

You've been in this job for three months; what have you learned about the company or the industry in the role that perhaps you didn't know coming into it?

I was at dinner with some of our biggest customers yesterday, and the faith, the trust that they have in Snowflake, the excitement that they feel about how we are leaning forward has been great. I knew some of this before, but I think that one-on-one reinforcement has been amazing to see. 

Over 15,000 people attended the Data Cloud Summit, where Ramaswamy spoke with Nvidia CEO Jensen Huang, who joined remotely from Computex in Taiwan.

In terms of what we can absolutely do better internally — and I'm very open about this — I think it's getting products to market faster: having the right investments in things like stability and problem detection so that you can push things out faster, and so you have a sense of urgency. I think you saw that get reflected in the summit keynote.

And then the third part is, how do we figure out scaled processes for taking these products and creating value with our customers? We have independent efforts in AI, in data engineering, in a number of areas, and I think I'm really pressing the gas hard on how to make that happen.

How do you see the company evolving? There are two different ways companies can evolve once they get to your size: You can go really deep into your core business, or you can branch out and add other things that are data adjacent. You introduced an observability tool this week, and a lot of people I've talked to see observability as a launching point to security. Do you want to become a broader enterprise vendor?

Our aspiration, first and foremost, is to be the strongest data platform. That is the core strength of the company, and that is where the majority of our efforts will go. I think that has incredible TAM.

Having said that, there is leverage to be had and dollars to be made with applications that target different kinds of users, different kinds of enterprise needs. We are taking a partnership approach. There are companies that build on top of Snowflake, companies like Blue Yonder; State Street has an effort; [JPMorgan Chase] has an effort. We think that is a more effective way for us to bring additional functionality to our customers in ways that leverage the strength of Snowflake.

There will be areas — AI, for example — where we are making direct investments because we think that providing our customers with a really easy way to get at their data is not something that Snowflake should say other people are going to do. Other people will do it, but we think it's important to have a first-party answer. But we pick those carefully.

But you do think it's important to get into the model wars?

I think of language models as a foundational capability. Twenty years ago, if you wanted to be a cloud company that could run data computations at scale, you learned how to buy and use Hadoop. That was a foundational skill, because that was the only way you could do large-scale data processing beyond what you could do with a database.

A few years later, being really, really good at what were then called Web 2.0 apps, the likes of Gmail, was a required capability for any team that made products. If you wanted to reach customers, you'd better know how to write a damn good web app like Gmail; that became the standard.

I think language models [need] to be a core competency for a data company. That's the reason why we have a foundation model team and an inference team and an application team. We think of these as the building blocks for any data platform.

Having said that, we're not investing like OpenAI or Microsoft, which literally spend billions of dollars. And to be honest, I hope that language models do not go down the path where it takes a billion dollars to have a credible language model, because that means that only three, four, five companies in the world would do it. I would love for there to be more broad-based language model innovation and research, with companies like Snowflake participating, but we are very comfortable with where we are. We are very competent at foundation models, without wanting to be in this rat race that can cost billions of dollars.

How did your time at Google inform what you're doing now?

I've run systems that made $100 billion a year, okay? There's a certain level of just incredibly high competence [at Google]; that's the caliber of that team. And I will joke to people that when you do a job for 15 years, you live through so many operational disasters, including stuff like the hunter dude who shot overland fiber with buckshot and brought down connectivity between Oregon and Atlanta. We were crippled for six weeks. You just get a certain confidence and swagger when you're like, "Okay, I know how to do this at scale." The problem is different, but I bring that wealth of experience.

Similarly on the business side, the ads model is a consumption model. If some advertiser doesn't spend, you don't make money. And so I have been through probably a dozen, two dozen revenue crises where it's like, "how do we tune this system? How do we launch new things? How do we talk to advertisers to get them to adopt new stuff?" Those are the things that you need to do to drive predictable growth, both when things are great, but especially when things are not so great, when there is a little dip in the external world. What can you do to compensate?

So I would say those things, from both a product and a business perspective, come from running a team at scale. My team of engineers and product managers was over 10,000 people, and the ads sales team was over 15,000 people. That's an army; that's 25,000 people. How do you wield them to build a business that can generate massive amounts of revenue? That skill is priceless.

When it comes to this era that we're in right now, obviously a lot of this foundational AI technology was developed at Google, yet brought to the world through different companies. Why do you think that happened?

Google was insular through much of the 2000s. Google's sincere business belief was that it made way more sense for Google to use its infrastructure to power the incredibly profitable businesses that search and ads were; there was not a lot of desire to even be a player in the cloud. Google famously would not even publish papers; it took me becoming SVP of Google Ads before I let my team publish papers, and we published a series of celebrated papers on data.

I think it took 10-plus years for Google to understand [the benefits of] interacting with the external world, whether it's the developer ecosystem or being a platform for other developers to build on. It took a long time to come, and by then Facebook had been established and had open sourced a lot of the software it built. Many of the archetypes that we use in cloud computing today came from Facebook, not from Google, and then from AWS taking those things and running them at scale.

The lesson there is when you run a business with, like, the best margin on the planet ever — which is the search ads business — every other business looks less attractive.
