sahil_singla's comments

Would love to see how you’re doing warehouse optimization. Is there a demo video I can look at?


That resonates with what we have heard from our customers.


Our belief is that building a good optimization tool isn't really aligned with Snowflake's interests. They seem more focused on enabling new use cases and workloads for their customers (their AI push with Cortex, for example). Helping Snowflake users cut costs, on the other hand, is our singular focus.


We differ from Keebo in how we approach warehouse optimization. Keebo seems to dynamically resize a warehouse, which we've found to be somewhat risky, especially when downsizing - performance can take a big hit. So we've approached the problem in two ways:

1. Route queries to a right-sized warehouse instead of resizing any single warehouse. This is part of our dbt optimizer module, and it keeps performance within acceptable limits while optimizing for cost.

2. Baselit's Autoscaler optimally manages how a multi-cluster warehouse scales out based on load, which is more cost-effective than upsizing the warehouse. (Rough sketch of both ideas below.)
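
To make that concrete, here's a rough sketch of the two ideas in Python using the standard snowflake-connector-python package. The warehouse names, sizing thresholds and cluster counts are placeholders, not our actual implementation:

    import snowflake.connector

    # Placeholder credentials; fill in for a real account.
    conn = snowflake.connector.connect(
        user="<user>", password="<password>", account="<account>"
    )
    cur = conn.cursor()

    # (1) Routing: send the query to a pre-created warehouse of the right
    # size instead of resizing a warehouse in place. Thresholds are made up.
    def pick_warehouse(expected_runtime_s):
        if expected_runtime_s < 60:
            return "WH_XSMALL"
        if expected_runtime_s < 600:
            return "WH_MEDIUM"
        return "WH_LARGE"

    cur.execute(f"USE WAREHOUSE {pick_warehouse(120)}")
    cur.execute("SELECT count(*) FROM my_db.my_schema.orders")  # the routed workload

    # (2) Scaling out: let a multi-cluster warehouse add same-size clusters
    # under concurrency instead of upsizing it. Counts/policy are illustrative.
    cur.execute("""
        ALTER WAREHOUSE WH_MEDIUM SET
            MIN_CLUSTER_COUNT = 1
            MAX_CLUSTER_COUNT = 4
            SCALING_POLICY = 'ECONOMY'
    """)

The point is that queries move between pre-created warehouses of fixed sizes, and concurrency spikes get absorbed by extra clusters of the same size rather than by a bigger warehouse.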


They are more focused on query optimization, whereas we do warehouse optimization. We lean towards warehouse optimization because it's completely hands-off.


cool - they're kinda complementary?


Espresso does warehouse optimization too, so they're competitors


Yeah, kind of.

Though I'm not exactly sure how their product works, I saw from the landing page that it's broadly focused on query optimization.

We've done a lot of experimentation with query optimization, both with and without LLMs, and we don't think a fully automated solution is possible. However, a workflow solution might be feasible.


+1. We came to a similar conclusion when we were working on this idea.


We optimize warehouse sizes for a dbt project as a whole. Users can set a maximum project runtime as one of the parameters for experimentation. The optimization honors this max runtime while tuning warehouse sizes for individual models.
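
Conceptually it's a constrained search: minimize credits spent subject to the whole project finishing within the cap. A toy Python version (not our actual algorithm - the model runtimes, the 1.7x speedup-per-size assumption and the greedy rule are all made up) looks like this:

    SIZES = ["XS", "S", "M", "L", "XL"]                  # smallest to largest
    CREDITS_PER_HOUR = {"XS": 1, "S": 2, "M": 4, "L": 8, "XL": 16}
    SPEEDUP_PER_STEP = 1.7   # made-up assumption: each size up cuts runtime ~1.7x

    def runtime_s(base_s, size):
        return base_s / (SPEEDUP_PER_STEP ** SIZES.index(size))

    def credits(base_s, size):
        return runtime_s(base_s, size) / 3600 * CREDITS_PER_HOUR[size]

    def size_models(base_runtimes, max_project_runtime_s):
        # Start everything on XS, then upsize whichever model buys the most
        # runtime reduction per extra credit until the project fits the cap.
        sizes = {m: "XS" for m in base_runtimes}
        total = lambda: sum(runtime_s(base_runtimes[m], s) for m, s in sizes.items())
        while total() > max_project_runtime_s:
            best = None
            for m, s in sizes.items():
                i = SIZES.index(s)
                if i + 1 == len(SIZES):
                    continue                             # already at the largest size
                bigger = SIZES[i + 1]
                saved = runtime_s(base_runtimes[m], s) - runtime_s(base_runtimes[m], bigger)
                extra = credits(base_runtimes[m], bigger) - credits(base_runtimes[m], s)
                if best is None or saved / extra > best[0]:
                    best = (saved / extra, m, bigger)
            if best is None:
                break                                    # cap unreachable even at XL
            sizes[best[1]] = best[2]
        return sizes

    # Toy example: per-model runtimes on XS (seconds), 1-hour project cap.
    print(size_models({"stg_orders": 1200, "fct_revenue": 5400, "dim_users": 600}, 3600))

This treats models as running sequentially and ignores warehouse startup/idle time - it's only meant to show the shape of the constraint.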


No AI yet - all the algorithms are deterministic under the hood. We're considering tinkering with LLMs for query optimization as part of our roadmap, though.

