Google Cloud Storage
Supported Types
| Logs | Metrics | Traces |
| :---: | :---: | :---: |
| ✓ | ✓ | ✓ |
Notes
The Google Cloud Storage destination saves telemetry as OTLP JSON objects in Google Cloud Storage.
The destination will create the bucket if it doesn't exist. Bucket names in GCS are globally unique, so if any other organization already owns a bucket with the given name, creation will fail and the destination will likely receive `403 Forbidden` responses when it tries to write. You can manually create the bucket in GCS ahead of time to ensure the name is not taken. Your credentials must have the Storage Admin role to create buckets, folders, and objects.
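If you prefer to pre-create the bucket yourself, one option is a small Google Cloud Deployment Manager template. This is a minimal sketch, and the bucket name below is a placeholder that must be replaced with your own globally unique name:

```yaml
# deployment.yaml -- minimal Deployment Manager sketch that pre-creates
# the bucket the destination will write to.
# "my-unique-telemetry-bucket" is a placeholder; bucket names are
# globally unique across all of GCS.
resources:
  - name: my-unique-telemetry-bucket
    type: storage.v1.bucket
    properties:
      location: US
      storageClass: STANDARD
```

You could then deploy it with `gcloud deployment-manager deployments create`, or simply create the bucket in the Cloud Console.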
Configuration
| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| telemetry_types* | telemetrySelector | Logs, Metrics, Traces | Specifies which types of telemetry to export. |
| bucket_name* | string | "" | The name of the bucket to store objects in. Must be globally unique. |
| project_id | string | "" | The ID of the Google Cloud project the bucket belongs to. Read from the credentials if not configured. |
| auth_type | enum | auto | The method used to authenticate to Google Cloud. Valid values are "auto", "json", or "file". |
| credentials | string | "" | JSON value from a Google Service Account credential file. Required if auth_type is "json". |
| credentials_file | string | "" | Path to a Google Service Account credential file. Required if auth_type is "file". |
| bucket_location | enum | US | The location of the bucket. Only used during bucket creation. |
| bucket_storage_class | enum | STANDARD | The storage class of the bucket. Only used during bucket creation. |
| partition | enum | minute | The granularity of the timestamps in the object path, either "minute" or "hour". |
| compression | enum | none | The compression algorithm to use when exporting telemetry, either "none" or "gzip". |
| folder_name | string | "" | An optional folder to put the objects in. Can be a nested folder path. |
| object_prefix | string | "" | An optional prefix to prepend to the object file name. |

*required field
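As an illustration of the authentication options, the fragment below configures the destination to read credentials from a file on the agent host rather than relying on "auto" detection. This is a sketch of a destination's `parameters` list; the credential path and bucket name are placeholder values:

```yaml
# Fragment of a Destination's spec.parameters using file-based auth.
# The credential path and bucket name below are placeholders.
- name: auth_type
  value: 'file'
- name: credentials_file
  value: '/etc/otel/gcs-credentials.json'
- name: bucket_name
  value: 'my-unique-telemetry-bucket'
```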
Supported Retry and Queuing Settings
This destination supports the following retry and queuing settings:
| Sending Queue | Persistent Queue | Retry on Failure |
| :---: | :---: | :---: |
| ✓ | ✓ | ✓ |
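In a destination resource, these settings are typically toggled through additional parameters. The parameter names in the sketch below are assumptions based on BindPlane's common destination conventions; check the YAML generated by your BindPlane version for the exact names:

```yaml
# Sketch only: these parameter names are assumed from BindPlane's
# common destination conventions and may differ by version.
- name: retry_on_failure_enabled
  value: true
- name: sending_queue_enabled
  value: true
- name: persistent_queue_enabled
  value: true
```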
Example Configuration
Basic Configuration
For a basic configuration, we specify the `bucket_name` and use the default location and storage class. The project ID will be read from the credentials. We set `compression` to `gzip` and specify a folder name and object prefix.
The object paths in GCS will look like this:
```
test-folder/year=2025/month=03/day=02/hour=06/minute=00/test-prefix_metrics_{random_id}.json.gz
test-folder/year=2025/month=03/day=02/hour=06/minute=00/test-prefix_logs_{random_id}.json.gz
test-folder/year=2025/month=03/day=02/hour=06/minute=00/test-prefix_traces_{random_id}.json.gz
```
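With `partition: 'hour'` instead of the default, the `minute=` segment would be dropped, so paths would look like this (inferred from the pattern above, not verified output):

```
test-folder/year=2025/month=03/day=02/hour=06/test-prefix_metrics_{random_id}.json.gz
```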
Standalone Destination
```yaml
apiVersion: bindplane.observiq.com/v1
kind: Destination
metadata:
  id: googlecloudstorage
  name: googlecloudstorage
spec:
  type: googlecloudstorage
  parameters:
    - name: telemetry_types
      value: ['Logs', 'Metrics', 'Traces']
    - name: bucket_name
      value: 'test-gcs-basic-configuration-bucket'
    - name: project_id
      value: ''
    - name: auth_type
      value: 'auto'
    - name: bucket_location
      value: 'US'
    - name: bucket_storage_class
      value: 'STANDARD'
    - name: partition
      value: 'minute'
    - name: compression
      value: 'gzip'
    - name: folder_name
      value: 'test-folder'
    - name: object_prefix
      value: 'test-prefix_'
```
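To route telemetry to this destination, it is referenced by name from a configuration resource. The sketch below is illustrative; the source name and labels are placeholders, and the exact structure may differ slightly by BindPlane version:

```yaml
# Sketch: a configuration that sends telemetry to the destination
# defined above. "host-metrics" and the labels are placeholders.
apiVersion: bindplane.observiq.com/v1
kind: Configuration
metadata:
  name: gcs-example
spec:
  sources:
    - name: host-metrics
  destinations:
    - name: googlecloudstorage
  selector:
    matchLabels:
      configuration: gcs-example
```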