Parse CSV
Supported telemetry: logs, metrics, and traces (v1.45.0+)
Description
The Parse CSV Processor parses CSV strings from specified fields within log, metric, or trace data. It's particularly useful when your telemetry contains serialized CSV strings that you need to convert into a structured format for easier analysis and filtering. The processor lets you specify both the source field and the target field for the parsed data, offering flexibility in handling diverse data structures.
Use
Use the Parse CSV Processor when your telemetry data includes CSV strings embedded within logs, metrics, or traces. For instance, logs from certain applications or systems might contain CSV strings representing specific attributes or metadata. The processor parses these CSV strings into structured data, improving readability and enabling more complex queries and analyses.
Configuration
Telemetry Type
The type of telemetry to apply the processor to.
Condition
The condition to apply the CSV parsing. It supports OTTL expressions for logs, metrics, and traces. This field determines which telemetry data entries are processed based on their content and attributes.
Source Field Type
Determines the type of source field for logs, metrics, or traces. This can be Resource, Attribute, Body, or Custom for logs and Resource, Attribute, or Custom for metrics and traces. It defines where the processor should look to find the CSV string to parse.
Source Field
Specifies the exact field where the CSV string is located, based on the selected Source Field Type. For instance, if the Source Field Type is Attribute, this field should specify the particular attribute containing the CSV string.
Target Field Type
Like the Source Field Type, this field determines the type of target field for logs, metrics, or traces where the parsed CSV data will be stored. The options are similar, allowing users to store the parsed data as a resource, attribute, body, or in a custom field.
Target Field
Specifies the exact field where the parsed CSV data will be stored, based on the selected Target Field Type. This allows users to organize and structure the parsed data in a manner that facilitates easy querying and analysis.
Header Field Type
Like the Source Field Type, this field determines the type of header field for parsing the CSV line. The default option, Static String, allows you to specify the CSV headers as a fixed string. The other options are similar to Source Field, allowing users to select dynamic headers from a resource, attribute, body, or in a custom field.
Headers
Only relevant when Header Field Type is set to Static String. This is the static CSV header row to use when parsing.
Header Field
Specifies the exact field where the CSV header row is located. This header will be used to determine the fields to use when parsing the CSV string.
Delimiter
Specifies the delimiter to be used as the separator between fields. By default, "," is used.
Header Delimiter
Specifies the delimiter to be used for the header row, if it differs from the delimiter used in the CSV row. If unspecified, Delimiter is used as the header delimiter.
Mode
Specifies the mode to use when parsing. Strict mode follows normal CSV parsing rules. Lazy Quotes allows bare quotes in the middle of an unquoted field. Ignore Quotes ignores all quoting rules for CSV, splitting purely based on the delimiter.
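The practical difference between the modes can be sketched with Python's standard csv module. This is only a rough analogy for illustration, not the processor's actual implementation:

```python
import csv
import io

line = 'ip,"a,b",c'

# Strict mode follows normal CSV quoting rules: the quoted field
# "a,b" is treated as a single value containing a comma.
strict = next(csv.reader(io.StringIO(line)))
print(strict)  # ['ip', 'a,b', 'c']

# Ignore Quotes splits purely on the delimiter, so the quoted field
# is broken apart and the quote characters are kept as-is.
ignore = line.split(",")
print(ignore)  # ['ip', '"a', 'b"', 'c']
```

Lazy Quotes sits between the two: quoting rules still apply, but a stray bare quote in the middle of an unquoted field is tolerated rather than treated as a parse error.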
Example Configurations
Parse CSV from Logs
In this example, we are looking to parse CSV strings from a log's body field and store the parsed data into the attributes field. The logs contain CSV strings detailing a web request, and we want to make this data more accessible.

Here is a sample log entry:
{
  "body": "10.0.0.1\tGET\t200",
  "attributes": {
    "log.file.name": "example.log",
    "log_type": "file"
  }
}
We want to parse the CSV string from the Body and store it as structured data within the log entry. The configuration for the Parse CSV Processor would be:
Condition:
true
Source Field Type:
Body
Source Field: Left empty
Target Field Type:
Attribute
Target Field: Left empty
Header Field Type:
Static String
Headers:
ip,method,status
Delimiter:
\t
Header Delimiter:
,
Mode:
Strict
The resulting log entry after processing would be:
{
  "body": "10.0.0.1\tGET\t200",
  "attributes": {
    "log.file.name": "example.log",
    "log_type": "file",
    "ip": "10.0.0.1",
    "method": "GET",
    "status": "200"
  }
}
This structured format makes it easier to filter and analyze the log data based on the ip, method, and status fields.
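The transformation above can be sketched in Python as a simplified stand-in for the processor. Note that the body uses a tab delimiter while the static header string uses commas, which is why the Delimiter and Header Delimiter settings differ:

```python
import csv
import io

log = {
    "body": "10.0.0.1\tGET\t200",
    "attributes": {"log.file.name": "example.log", "log_type": "file"},
}

headers = "ip,method,status"   # static header string (Headers setting)
delimiter = "\t"               # delimiter for the CSV row in the body
header_delimiter = ","         # headers use a different delimiter

# Parse the header row and the body row with their respective delimiters.
fields = next(csv.reader(io.StringIO(headers), delimiter=header_delimiter))
values = next(csv.reader(io.StringIO(log["body"]), delimiter=delimiter))

# Merge the parsed key/value pairs into the log's attributes.
log["attributes"].update(dict(zip(fields, values)))
print(log["attributes"])
```

Running this yields the enriched attributes shown in the resulting log entry above.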