Kumo’s Native App consumes Snowflake resources in proportion to the size of your data and the complexity of the models being trained. These resources are consumed primarily by the two key components Kumo runs on Snowpark Container Services:
Kumo Control Plane (UI and API management).
Kumo AI Engine (model training and inference).
How Resource Consumption is Measured: As of 11-Feb-2025, Snowflake charges based on:
Hardware configuration (the CPU, GPU, and memory allocated to the compute instance).
Duration for which the compute instance runs.
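In other words, the estimated credit cost is the per-hour credit rate of the chosen instance type multiplied by the number of hours the instance runs. The sketch below illustrates this calculation; the per-hour rates in it are illustrative placeholders, not official Snowflake figures (see the Snowflake Credit Consumption Table referenced under Estimated Credit Usage below).

```python
# Minimal sketch of how the two billing dimensions combine.
# The per-hour credit rates below are illustrative placeholders, NOT
# official Snowflake numbers; check the Snowflake Credit Consumption
# Table for current values.
ILLUSTRATIVE_CREDITS_PER_HOUR = {
    "GPU_NV_M": 10.0,  # placeholder rate
    "GPU_NV_L": 25.0,  # placeholder rate
}

def estimated_credits(instance_type: str, hours_running: float) -> float:
    """Estimated credits = per-hour rate for the instance type * hours the instance runs."""
    return ILLUSTRATIVE_CREDITS_PER_HOUR[instance_type] * hours_running

# Example: the app running for a 40-hour work week on GPU_NV_M,
# versus running 24/7 (168 hours) with no suspension.
print(estimated_credits("GPU_NV_M", 40))   # 400.0 credits at the placeholder rate
print(estimated_credits("GPU_NV_M", 168))  # 1680.0 credits at the placeholder rate
```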
Compute Resources:
Both the Control Plane and AI Engine run on a single GPU compute instance in SPCS.
As of 11-Feb-2025, the available instance types are GPU_NV_M and GPU_NV_L, selected at app launch.
Managing Resources Efficiently:
The compute instance consumes credits continuously while the app is running.
To save resources, the app can be suspended without losing data.
Instructions on suspending the app are available here.
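If you prefer to script suspension rather than perform it manually, the sketch below shows one way to issue a suspend step with the Snowflake Python connector. This is a minimal sketch under assumptions: the procedure name KUMO_APP.ADMIN.SUSPEND() is a hypothetical placeholder, so substitute the exact command from the suspension instructions linked above.

```python
# Minimal sketch of scripting suspension with the Snowflake Python
# connector. The SQL text is a HYPOTHETICAL placeholder -- replace it
# with the exact command from Kumo's suspension instructions.
import snowflake.connector

SUSPEND_SQL = "CALL KUMO_APP.ADMIN.SUSPEND()"  # hypothetical procedure name

def suspend_kumo_app() -> None:
    """Issue the (placeholder) suspend command so the GPU instance stops accruing credits."""
    conn = snowflake.connector.connect(
        account="<account_identifier>",
        user="<user>",
        password="<password>",
        role="<role_with_app_privileges>",
    )
    try:
        conn.cursor().execute(SUSPEND_SQL)
    finally:
        conn.close()
```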
Estimated Credit Usage: The table below provides an estimate of the Snowflake credits consumed per hour for each container size. The most up-to-date numbers can be found in the official Snowflake Credit Consumption Table.