# Filelog

{% hint style="danger" %}
**WARNING**

This source offers a `delete_after_read` option that can be hazardous: when combined with file globbing, it deletes every file that matches the globbing pattern. Use with caution.
{% endhint %}

***

### Supported Platforms

| Platform | Metrics | Logs | Traces |
| -------- | ------- | ---- | ------ |
| Linux    |         | ✓    |        |
| Windows  |         | ✓    |        |
| macOS    |         | ✓    |        |

### Configuration Table

<table><thead><tr><th width="227.37109375">Parameter</th><th width="122.4765625">Type</th><th width="87.3125">Default</th><th>Description</th></tr></thead><tbody><tr><td>file_path*</td><td><code>strings</code></td><td></td><td>File or directory paths to tail for logs.</td></tr><tr><td>exclude_file_path</td><td><code>strings</code></td><td>""</td><td>File or directory paths to exclude.</td></tr><tr><td>log_type</td><td><code>string</code></td><td>"file"</td><td>A friendly name that will be added to each log entry as an attribute.</td></tr><tr><td>multiline_line_start_pattern</td><td><code>string</code></td><td></td><td>Regex pattern that matches the beginning of a log entry, for handling multiline logs.</td></tr><tr><td>multiline_line_end_pattern</td><td><code>string</code></td><td></td><td>Regex pattern that matches the end of a log entry, useful for terminating parsing of multiline logs.</td></tr><tr><td>encoding</td><td><code>enum</code></td><td>utf-8</td><td>The encoding of the file being read. Valid values are <code>nop</code>, <code>utf-8</code>, <code>utf-16le</code>, <code>utf-16be</code>, <code>ascii</code>, and <code>big5</code>.</td></tr><tr><td>compression</td><td><code>enum</code></td><td>none</td><td>The compression format of the input files. If set, files are decompressed before their content is scanned. Valid values are <code>none</code> or <code>gzip</code>. Requires <code>start_at</code> to be set to <code>beginning</code>. Ensure that your collector has permission to decompress the file.</td></tr><tr><td>include_file_name_attribute</td><td><code>bool</code></td><td>true</td><td>Whether to add the file name as the attribute <code>log.file.name</code>.</td></tr><tr><td>include_file_path_attribute</td><td><code>bool</code></td><td>false</td><td>Whether to add the file path as the attribute <code>log.file.path</code>.</td></tr><tr><td>include_file_name_resolved</td><td><code>bool</code></td><td>false</td><td>Whether to add the file name, after symlink resolution, as the attribute <code>log.file.name_resolved</code>.</td></tr><tr><td>include_file_path_resolved</td><td><code>bool</code></td><td>false</td><td>Whether to add the file path, after symlink resolution, as the attribute <code>log.file.path_resolved</code>.</td></tr><tr><td>delete_after_read</td><td><code>bool</code></td><td>false</td><td>Whether to delete the file(s) after reading. Only valid in combination with <code>start_at: beginning</code>.</td></tr><tr><td>offset_storage_dir</td><td><code>string</code></td><td>$OIQ_OTEL_COLLECTOR_HOME/storage</td><td>The directory where the offset storage file will be created. Multiple receivers may share the same directory. By default, the <a href="https://github.com/observIQ/bindplane-otel-collector">Bindplane Distro for OpenTelemetry Collector</a> sets <code>$OIQ_OTEL_COLLECTOR_HOME</code> in its runtime.</td></tr><tr><td>poll_interval</td><td><code>int</code></td><td>200</td><td>The duration, in milliseconds, between filesystem polls.</td></tr><tr><td>max_concurrent_files</td><td><code>int</code></td><td>1024</td><td>The maximum number of log files read concurrently. If the number of matched files exceeds this number, files are processed in batches.</td></tr><tr><td>parse_to</td><td><code>string</code></td><td>body</td><td>The <a href="https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/main/pkg/stanza/docs/types/field.md">field</a> that the log will be parsed to. Some exporters handle logs more favorably when parsed to <code>attributes</code> rather than <code>body</code>, and vice versa.</td></tr><tr><td>start_at</td><td><code>enum</code></td><td>end</td><td>Start reading the file from the <code>beginning</code> or the <code>end</code>.</td></tr></tbody></table>

<mark style="color:red;">\*</mark>*<mark style="color:red;">required field</mark>*
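As an illustration of the parameters above, a filelog source that tails an application log directory and groups timestamp-prefixed multiline entries might look like the following sketch. The paths, `log_type` value, and regex are placeholders, not values from this page:

```yaml
# Hypothetical example: tail all .log files in /var/log/myapp,
# excluding rotated copies, and treat any line beginning with a
# YYYY-MM-DD timestamp as the start of a new (possibly multiline) entry.
file_path:
  - /var/log/myapp/*.log
exclude_file_path:
  - /var/log/myapp/*.log.1
log_type: myapp
multiline_line_start_pattern: '^\d{4}-\d{2}-\d{2}'
encoding: utf-8
start_at: beginning
include_file_path_attribute: true
```

Setting `start_at: beginning` ensures historical lines in matched files are read on first poll; the default of `end` only picks up lines written after the source starts.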

#### Parsing

For the latest version of the filelog source, parsing capabilities have been removed. To replace this functionality, add a processor to your pipeline.

To parse JSON data from a file, add the [Parse JSON Processor](/integrations/processors/parse-json.md).

To parse data from a file using regex, add the [Parse with Regex Processor](/integrations/processors/parse-with-regex.md).
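For example, a file of newline-delimited JSON events could pair this source with the Parse JSON Processor in the same pipeline. This is a sketch under the assumption that each line in the file is a standalone JSON object; the path is a placeholder:

```yaml
# Hypothetical source configuration; a Parse JSON processor attached
# to this source's pipeline then parses each line's body into
# structured fields, replacing the parsing the source no longer does.
file_path:
  - /var/log/myapp/events.json
log_type: myapp-json
start_at: beginning
```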


