# Snowflake

### Description

This destination can send logs, metrics, and traces to Snowflake, a cloud data warehouse service.

### Supported Types

| Logs | Metrics | Traces | Bindplane Collector |
| ---- | ------- | ------ | ------------------- |
| ✓    | ✓       | ✓      | `v1.45.0`+          |

### Prerequisites

* Snowflake [data warehouse](https://docs.snowflake.com/en/user-guide/warehouses-overview)
* Snowflake user with appropriate privileges

The following guide details how to initialize a Snowflake user that can be used with this destination. It is meant to get telemetry flowing with minimal time spent configuring. If you'd like to take an alternative approach, check out the exporter documentation on this topic [here](https://github.com/observIQ/bindplane-otel-collector/tree/main/exporter/snowflakeexporter#alternatives).

Snowflake offers a variety of ways to connect, but these instructions are tailored to the "Classic Console", since all accounts have access to it.

Before starting, log in to Classic Console using a user that has access to the `ACCOUNTADMIN` role, or another role in your Snowflake account that has permission to grant privileges and create users. If the default role is not the required one, you'll need to switch to that role using this SQL command (replace the role as needed):

```sql
USE ROLE "ACCOUNTADMIN";
```
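Optionally, you can confirm which role is active before continuing. `CURRENT_ROLE()` is a built-in Snowflake context function:

```sql
-- Returns the name of the role in use for the current session,
-- e.g. ACCOUNTADMIN after the role switch above.
SELECT CURRENT_ROLE();
```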

These instructions grant privileges to one of the default roles Snowflake is initialized with, `SYSADMIN`. If you want to grant privileges to a different role, simply substitute `SYSADMIN` with your role in the SQL commands.

#### 1. Grant Warehouse Usage

First, we need to grant the `USAGE` privilege to the `SYSADMIN` role on the data warehouse that telemetry data will be stored in. Run this SQL command (replace `TEST` with your warehouse name):

```sql
GRANT USAGE ON WAREHOUSE "TEST" TO ROLE "SYSADMIN";
```

#### 2. Grant Create Database Privilege

Next, the `SYSADMIN` role needs to be granted the ability to create databases in the Snowflake account. Run the following SQL to do so:

```sql
GRANT CREATE DATABASE ON ACCOUNT TO ROLE "SYSADMIN";
```

#### 3. Create New User For Bindplane

Now a new user needs to be created that the Bindplane Collector can log in as. It's recommended to assign `SYSADMIN` as the user's default role, although it isn't strictly necessary.

**Note:** If the default role is not assigned, then the exporter will need to be configured with the correct role to work.

Remember the login name and password you use here; the destination must be configured with these values. Replace the user name, password, and login name in the following SQL to match yours:

```sql
CREATE USER BP_AGENT PASSWORD = 'password' LOGIN_NAME = 'BP_AGENT' DEFAULT_ROLE = "SYSADMIN";
```

#### 4. Grant SYSADMIN Role to the User

Even though the default role was set to `SYSADMIN`, the role still needs to be granted to the new user; setting a default role does not grant it. This can be done using the next SQL command (replace the user as needed):

```sql
GRANT ROLE "SYSADMIN" TO USER BP_AGENT;
```

We now have a Snowflake user with the correct permissions to create a database, schemas, and tables, and to use the configured warehouse for storing telemetry data.
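To sanity-check the setup, you can list the privileges granted to the role and to the new user. `SHOW GRANTS` is a standard Snowflake command; the role and user names below match the examples above, so substitute your own if they differ:

```sql
-- Privileges granted to the role (should include USAGE on the
-- warehouse and CREATE DATABASE on the account).
SHOW GRANTS TO ROLE "SYSADMIN";

-- Roles granted to the new user (should include SYSADMIN).
SHOW GRANTS TO USER BP_AGENT;
```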

### Configuration

<table><thead><tr><th width="159.71484375">Field</th><th>Description</th></tr></thead><tbody><tr><td>Choose Telemetry Type</td><td>The kinds of telemetry to send to Snowflake.</td></tr><tr><td>Account Identifier</td><td>The account identifier for the Snowflake account that data should be sent to.</td></tr><tr><td>Warehouse</td><td>The Snowflake warehouse that telemetry data should be stored in.</td></tr><tr><td>Username</td><td>Username the exporter will use to authenticate with Snowflake.</td></tr><tr><td>Password</td><td>Password the exporter will use to authenticate with Snowflake.</td></tr><tr><td>Database</td><td>The Snowflake database that telemetry schemas will be stored in. Will be created if it doesn't exist.</td></tr><tr><td>Log Schema</td><td>The schema that the log table will be stored in. Will be created if it doesn't exist.</td></tr><tr><td>Log Table</td><td>The table that logs will be stored in. Will be created if it doesn't exist.</td></tr><tr><td>Metric Schema</td><td>The schema that the metric tables will be stored in. Will be created if it doesn't exist.</td></tr><tr><td>Metric Table</td><td>The prefix used for metric tables. Tables are created if they don't exist. See this <a href="https://github.com/observIQ/bindplane-otel-collector/tree/main/exporter/snowflakeexporter#metrics">exporter documentation</a> for more.</td></tr><tr><td>Trace Schema</td><td>The schema that the trace table will be stored in. Will be created if it doesn't exist.</td></tr><tr><td>Trace Table</td><td>The table that traces will be stored in. Will be created if it doesn't exist.</td></tr><tr><td>Role</td><td>The Snowflake role the exporter should use. Only required if the default role of the provided credentials does not have correct privileges.</td></tr><tr><td>Parameters</td><td>Additional optional parameters the exporter should use when connecting to Snowflake. This option is generally not required. See this <a href="https://docs.snowflake.com/en/sql-reference/parameters">Snowflake documentation</a> for more.</td></tr></tbody></table>

This destination supports the [retry settings](https://docs.bindplane.com/configuration/bindplane-otel-collector/retry-on-failure), the [sending queue settings](https://docs.bindplane.com/configuration/bindplane-otel-collector/sending-queue), and the [persistent queue settings](https://docs.bindplane.com/configuration/bindplane-otel-collector/persistent-queue).

| Sending Queue | Persistent Queue | Retry on Failure |
| ------------- | ---------------- | ---------------- |
| ✓             | ✓                | ✓                |

### Example Configuration

Here we will configure this destination to send all telemetry types to a Snowflake account using the default values for database resources. We'll also configure the sending queue, persistent queue, and retry on failure.

**Web Interface**

<figure><img src="https://1405008107-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FgmiOMzBfoNFwmKJFHMcJ%2Fuploads%2Fgit-blob-7a49d7c5aee8ab2bec12353b62d00d85215b8b15%2Fintegrations-destinations-snowflake-image-1.png?alt=media" alt="Bindplane docs - Snowflake - image 1"><figcaption></figcaption></figure>

<figure><img src="https://1405008107-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FgmiOMzBfoNFwmKJFHMcJ%2Fuploads%2Fgit-blob-cebdefe3952a5cbee509cf7f9c85a5e4e1ff8526%2Fintegrations-destinations-snowflake-image-2.png?alt=media" alt="Bindplane docs - Snowflake - image 2"><figcaption></figcaption></figure>

<figure><img src="https://1405008107-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FgmiOMzBfoNFwmKJFHMcJ%2Fuploads%2Fgit-blob-dd64ec93a2eed83ef483db833403c391b214a18a%2Fintegrations-destinations-snowflake-image-3.png?alt=media" alt="Bindplane docs - Snowflake - image 3"><figcaption></figcaption></figure>

**Standalone Destination**

```yaml
apiVersion: bindplane.observiq.com/v1
kind: Destination
metadata:
  id: snowflake
  name: snowflake
spec:
  type: snowflake
  parameters:
    - name: telemetry_types
      value: ['Logs', 'Metrics', 'Traces']
    - name: account_id
      value: 'my_account_id'
    - name: warehouse
      value: 'my_warehouse'
    - name: username
      value: 'my_user'
    - name: password
      value: 'my_pass'
    - name: sending_queue_enabled
      value: true
    - name: retry_on_failure_enabled
      value: true
    - name: persistent_queue_enabled
      value: true
    - name: persistent_queue_storage_extension
      value: 
        type: file_storage_persistent_queue
        parameters:
          - name: persistent_queue_directory
            value: ${OIQ_OTEL_COLLECTOR_HOME}/storage
          - name: timeout
            value: 1
          - name: persistent_queue_fsync
            value: true
```
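If the user's default role does not have the required privileges, a role can be set on the destination as well. The fragment below is illustrative only: the `role` and `database` parameter names are assumptions based on the field names in the configuration table above, not verified against the destination's schema, and the values are placeholders.

```yaml
# Illustrative fragment: parameter names are assumptions inferred
# from the configuration table, added under spec.parameters.
    - name: role
      value: 'SYSADMIN'
    - name: database
      value: 'my_database'
```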

