Pipeline Intelligence

Pipeline Intelligence is a suite of recommendations and AI-powered features that help you automate your pipeline configuration. Instead of writing manual configuration and complex OTTL syntax, you can use natural language descriptions and intelligent data analysis to quickly build and optimize your pipelines.
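For context, this is the kind of OTTL statement Pipeline Intelligence can write for you. The attribute name, log type value, and match pattern below are illustrative, not a real generated configuration:

```
set(attributes["log_type"], "windows_event") where IsMatch(body, "EventID[=:]")
```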

With Pipeline Intelligence, you can:

  • Automatically identify and categorize log types from your telemetry data

  • Standardize log types for Google SecOps ingestion

  • Generate processors using natural language

  • Parse complex telemetry into structured data

  • Get intelligent recommendations for pipeline improvements

Pipeline Intelligence Suggestions

Pipeline Intelligence analyzes your telemetry data and provides context-aware suggestions that can help improve your pipeline, such as adding missing fields, removing redundant fields, and parsing data.

Snapshot View

The expanded snapshot row view includes several helpful Pipeline Intelligence features for logs. When you expand a row, Pipeline Intelligence automatically detects the log's log type and body format, and actions for parsing or standardization appear if needed.

AI Features

Get Log Types

Automatically identify log types from your log snapshot data.

How it works:

  1. Click "Get Log Types" from the Pipeline Intelligence panel.

  2. Pipeline Intelligence analyzes the snapshot and streams the generated log types as output.

  3. Log types are automatically identified and displayed as chips in the snapshot console.

  4. You can click any log type chip to see additional actions for that log type.
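The identification itself is AI-driven, but conceptually it resembles matching each log body against known signatures. A minimal Python sketch of the idea (the signature patterns and log type names here are hypothetical, not the product's actual detection logic):

```python
import re

# Hypothetical signature patterns for a few common log types.
LOG_TYPE_SIGNATURES = {
    "WINDOWS_EVENT": re.compile(r"EventID[=:]\s*\d+"),
    "APACHE_ACCESS": re.compile(r'^\S+ \S+ \S+ \[[^\]]+\] "[A-Z]+ '),
    "SYSLOG": re.compile(r"^<\d+>"),
}

def identify_log_type(body: str) -> str:
    """Return the first matching log type, or 'UNKNOWN'."""
    for log_type, pattern in LOG_TYPE_SIGNATURES.items():
        if pattern.search(body):
            return log_type
    return "UNKNOWN"
```

In the real feature, each identified type then appears as a chip in the snapshot console rather than a returned string.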

Standardize Log Type for SecOps

Automatically generate a Google SecOps standardization processor for specific log types.

How it works:

  1. Click into a processor node that has a Google SecOps source connected to it.

  2. Generate log types for the snapshot (steps shown above).

  3. After generating log types, Pipeline Intelligence will recommend a new action: "Standardize Log Type for SecOps".

  4. Select a log type from the drop-down (or choose "All Log Types" to standardize multiple types).

  5. Click "Generate" to create the standardization processor with the appropriate log type and conditional statement.
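Conceptually, the generated processor pairs a SecOps log type with a conditional statement, so only records that match the condition are tagged. A minimal Python sketch of that pairing (the record shape, attribute name, and log type label are assumptions, not the product's actual schema):

```python
def standardize_for_secops(record: dict, log_type: str, condition) -> dict:
    """Tag the record with a SecOps log type only when the condition matches."""
    if condition(record):
        # Attribute name is an assumption for illustration only.
        record.setdefault("attributes", {})["chronicle_log_type"] = log_type
    return record

# Tagged: the condition matches this record's body.
tagged = standardize_for_secops(
    {"body": "EventID: 4624", "attributes": {}},
    "WINEVTLOG",
    condition=lambda r: "EventID" in r["body"],
)

# Untouched: the condition does not match, so no log type is applied.
skipped = standardize_for_secops(
    {"body": "unrelated message", "attributes": {}},
    "WINEVTLOG",
    condition=lambda r: "EventID" in r["body"],
)
```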

Generate Processors

Create processors using natural language descriptions of what you want to accomplish.

How it works:

  1. Enter a description in the Pipeline Intelligence input field. Examples:

    • "Filter my logs to only let Windows Events through"

    • "Batch my logs to send to Google SecOps"

    • "Create a new attribute to keep track of the host name."

    • "Parse JSON logs and extract the user_id field"

  2. Click "Generate".

  3. Pipeline Intelligence will analyze your pipeline and create processors to accomplish your goal.

  4. Processors are automatically added to your pipeline. You may modify or delete the generated processors.
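For example, a request like "Parse JSON logs and extract the user_id field" resolves to a transformation along these lines. This is a Python sketch; the record shape is an assumption, not the generated processor itself:

```python
import json

def extract_user_id(record: dict) -> dict:
    """Parse a JSON log body and promote user_id to an attribute."""
    try:
        parsed = json.loads(record["body"])
    except (ValueError, TypeError):
        return record  # body is not JSON; leave the record unchanged
    if "user_id" in parsed:
        record.setdefault("attributes", {})["user_id"] = parsed["user_id"]
    return record

rec = extract_user_id({"body": '{"user_id": "u-123", "msg": "login"}'})
# rec["attributes"]["user_id"] == "u-123"
```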

Parse Field

Automatically create parsing processors to extract structured data from input fields.

How it works:

  1. In the snapshot console, click on any log body, attribute, or resource field.

  2. Select "Parse Field" from the Pipeline Intelligence menu.

  3. Review the field preview showing the data to be parsed.

  4. Click "Generate Parser" to create the appropriate parsing processor.

  5. Pipeline Intelligence detects the log's format (JSON, CSV, key-value, XML, or other) and creates the corresponding processor to parse the fields.
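The format detection in step 5 can be sketched as a best-effort cascade of parsers. An illustrative Python version (not the product's actual detection logic):

```python
import json
import re
import xml.etree.ElementTree as ET

def detect_format(value: str) -> str:
    """Best-effort detection of a field's format, mirroring the
    JSON / XML / key-value / CSV / other categories above."""
    value = value.strip()
    try:
        json.loads(value)
        return "json"
    except ValueError:
        pass
    try:
        ET.fromstring(value)
        return "xml"
    except ET.ParseError:
        pass
    if re.search(r"\b\w+=\S+", value):
        return "key-value"
    if "," in value:
        return "csv"
    return "other"
```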

Parse with Regex

The Parse with Regex processor includes a "Generate with Pipeline Intelligence" button. It behaves similarly to Parse Field, but focuses solely on creating a regular expression.

How it works:

  1. Specify a Source Field Type and Source Field (leave empty to use the body).

  2. Click "Generate with Pipeline Intelligence".

  3. Pipeline Intelligence will generate a regex to parse the specified field.
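Generated regexes of this kind typically use named capture groups, so each group becomes a parsed field. A hypothetical example in Python (the log line and pattern are illustrative, not actual generated output):

```python
import re

LINE = "2024-05-01T12:00:00Z ERROR auth-service: login failed for alice"

# Each named group becomes a field in the parsed result.
PATTERN = re.compile(
    r"^(?P<timestamp>\S+) (?P<severity>\w+) (?P<service>[\w-]+): (?P<message>.*)$"
)

match = PATTERN.match(LINE)
fields = match.groupdict() if match else {}
# fields == {"timestamp": "2024-05-01T12:00:00Z", "severity": "ERROR",
#            "service": "auth-service", "message": "login failed for alice"}
```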

Best Practices

  • Always review AI-generated processors before deploying to production. While AI is designed to create correct configurations, it may make mistakes.

  • Things to verify:

    • Field paths match your actual data structure

    • Conditions and filters work as expected

    • Processor order is correct for your use case

  • Begin with simple requests and gradually add complexity.

  • Use multiple Pipeline Intelligence features together:

    • Use "Get Log Types" to identify log types

    • Use "Standardize Log Type for SecOps" on those log types if needed

    • Use "Parse Field" to parse fields if needed

    • Use "Generate Processors" to add any transformations
