How to monitor Snowflake spend in Slack

Zach Mandell
January 9, 2024

Overview

In this post, I’ll show you how we can use Snowflake’s ORGANIZATION_USAGE schema to track warehouse compute costs across your organization. Each time you load data, execute queries, or perform DML (data manipulation language) operations, you consume credits, which are then billed to your account at a certain rate (i.e., cost per credit).

Warehouse compute costs are just one line item among others in your Snowflake bill, alongside storage and other compute resources unrelated to warehouse compute. Performing a comprehensive accounting of Snowflake costs is a time-intensive exercise — there are entire companies focused on tracking Snowflake spend. If this is your first time digging in, warehouse compute cost is the best place to start. It usually makes up the majority of spend and will vary depending on your team’s scope of usage and behavior.
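To make the credits-times-rate arithmetic concrete, here's a minimal sketch. The warehouse size, hours, and dollar rate below are hypothetical — your actual rate comes from your Snowflake contract:

```sql
-- Hypothetical example: a Medium warehouse consumes 4 credits/hour.
-- Running it for 8 hours at an assumed rate of $3.00 per credit:
select 4 * 8 * 3.00 as compute_spend_usd;  -- 96.00
```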

Step 1: Check Snowflake Roles and Privileges

Before getting started, let’s make sure we have the right permissions. You’ll need to:

  • Access the data from an account that has the ORGADMIN role enabled (i.e. the ORGADMIN account).
  • Have privileges to view the data. Within the ORGADMIN account, anyone who has access to the shared SNOWFLAKE database has access to the ORGANIZATION_USAGE schema. By default, only the ACCOUNTADMIN role has privileges to this database, which means the ORGADMIN role does not have the necessary privileges.
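If your monitoring role lacks access, an ACCOUNTADMIN in the ORGADMIN account can grant it. A sketch, where REPORTING_ROLE is a hypothetical role name — substitute the role your monitoring user actually uses:

```sql
-- Run as ACCOUNTADMIN in the ORGADMIN account.
use role ACCOUNTADMIN;

-- Grants read access to the shared SNOWFLAKE database,
-- including the ORGANIZATION_USAGE schema.
grant imported privileges on database SNOWFLAKE to role REPORTING_ROLE;
```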

Step 2: Validate Privileges and Verify the Data

We’ll use two different views in the ORGANIZATION_USAGE schema to calculate warehouse compute costs:

  • ORGANIZATION_USAGE.WAREHOUSE_METERING_HISTORY - this contains hourly credit usage by account and warehouse across the organization.
  • ORGANIZATION_USAGE.RATE_SHEET_DAILY - this contains the daily effective rate or cost per credit for various services.

Snowflake provides either a legacy or upgraded version of the rate sheet view, so it’s important to check which version exists within your organization, as columns in the view will differ. I’ve included a query below, which works for our internal Snowflake instance, but you might need to adjust it depending on what exists in either table for your org.
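One way to check which version you have is to inspect the view's columns and a few sample rows before writing the cost query. A sketch — compare the columns you see against the ones referenced in the query below:

```sql
-- List the columns on the rate sheet view; the legacy and upgraded
-- versions expose different columns.
describe view SNOWFLAKE.ORGANIZATION_USAGE.RATE_SHEET_DAILY;

-- Eyeball a few rows to confirm the service/usage type values
-- present in your org.
select *
from SNOWFLAKE.ORGANIZATION_USAGE.RATE_SHEET_DAILY
order by date desc
limit 10;
```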

Code Snippet Example

select 
  date(wm.end_time) as compute_date, 
  wm.warehouse_name as warehouse_name, 
  -- credits consumed * that day's effective rate; coalesce to 0
  -- for days with no matching rate sheet row
  SUM(
    coalesce(
      wm.credits_used_compute * rs.effective_rate, 
      0
    )
  ) as compute_spend 
from 
  ORGANIZATION_USAGE.WAREHOUSE_METERING_HISTORY wm 
  left join ORGANIZATION_USAGE.RATE_SHEET_DAILY rs 
    on date(wm.end_time) = rs.date 
    and rs.service_type = 'COMPUTE' 
    and rs.usage_type = 'compute' 
where 
  wm.service_type = 'WAREHOUSE_METERING' 
group by 
  1, 
  2
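Before wiring this into a tool, it can help to sanity-check the output by rolling the daily figures up by month. A sketch that reuses the query above as a CTE (same views and columns):

```sql
with daily_spend as (
  -- per-day, per-warehouse spend (the query from above)
  select
    date(wm.end_time) as compute_date,
    wm.warehouse_name,
    sum(coalesce(wm.credits_used_compute * rs.effective_rate, 0)) as compute_spend
  from ORGANIZATION_USAGE.WAREHOUSE_METERING_HISTORY wm
  left join ORGANIZATION_USAGE.RATE_SHEET_DAILY rs
    on date(wm.end_time) = rs.date
    and rs.service_type = 'COMPUTE'
    and rs.usage_type = 'compute'
  where wm.service_type = 'WAREHOUSE_METERING'
  group by 1, 2
)
-- monthly totals across all warehouses
select
  date_trunc('month', compute_date) as month,
  sum(compute_spend) as monthly_spend
from daily_spend
group by 1
order by 1;
```

Compare a recent month's total against your Snowflake invoice to confirm the numbers line up.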

Step 3: Establish a Datasource Connection and Create a Metric in Push.ai

First, you’ll need to connect your Snowflake instance with Push.ai. If you have an existing connection, you’ll need to create an additional one which connects to the SNOWFLAKE database and may require a different user or role.

Next, use the query from the previous step to create a custom SQL metric in Push.ai. The date dimension should be the compute_date column, and we will SUM over the compute_spend column to arrive at a metric for warehouse compute cost.

Bonus: Build the metric into a semantic layer with one of our integration partners!

Step 4: Monitor the Warehouse Compute Cost Metric using Reports or Insights

Use Reports to easily track Snowflake spend each month. In this Report, I can track what my spend is for the current month, as well as where I’m projected to be at the end of the month.
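The projection in the Report can be approximated with a simple linear extrapolation of month-to-date spend. A sketch, assuming the daily_spend query from Step 2 is available as a CTE or table:

```sql
-- Linear month-end projection: spend so far this month, scaled by
-- (days in month / days elapsed). DAY() and LAST_DAY() are standard
-- Snowflake date functions.
select
  sum(compute_spend) as mtd_spend,
  sum(compute_spend)
    / day(current_date())
    * day(last_day(current_date())) as projected_month_end_spend
from daily_spend
where date_trunc('month', compute_date) = date_trunc('month', current_date());
```

This assumes spend is roughly uniform across the month; a bursty workload will make the projection noisy early in the month.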


ABOUT THE AUTHOR
Zach Mandell

Zach is the CEO of Push.ai. He’s been an engineer and a data analyst, and loves building, whether with his hands or on the keyboard. You can find him surfing the Pacific coast of Mexico, or looking for the best spot in town for a loaf of sourdough.
