From da6fd014448061df7ca3ffe71e469ab2f2d2e77b Mon Sep 17 00:00:00 2001
From: Callum Styan
Date: Mon, 29 Aug 2022 07:40:03 -0700
Subject: [PATCH] [docs] document logfmt pipeline stage (#6238)

I noticed earlier while writing a design doc that the `logfmt` pipeline
stage is not documented. It was originally added here:
https://github.com/grafana/loki/pull/4346

I've also fixed a minor typo in the json pipeline stage docs.

Signed-off-by: Callum Styan
Signed-off-by: Edward Welch
Co-authored-by: Ed Welch
Co-authored-by: Karen Miller <84039272+KMiller-Grafana@users.noreply.github.com>
Co-authored-by: Edward Welch
---
 .../sources/clients/promtail/stages/_index.md |  2 +
 docs/sources/clients/promtail/stages/json.md  |  2 +-
 .../sources/clients/promtail/stages/logfmt.md | 88 +++++++++++++++++++
 3 files changed, 91 insertions(+), 1 deletion(-)
 create mode 100644 docs/sources/clients/promtail/stages/logfmt.md

diff --git a/docs/sources/clients/promtail/stages/_index.md b/docs/sources/clients/promtail/stages/_index.md
index a0ecffa988..9d3bf7be3a 100644
--- a/docs/sources/clients/promtail/stages/_index.md
+++ b/docs/sources/clients/promtail/stages/_index.md
@@ -12,6 +12,7 @@ Parsing stages:
 - [cri](cri/): Extract data by parsing the log line using the standard CRI format.
 - [regex](regex/): Extract data using a regular expression.
 - [json](json/): Extract data by parsing the log line as JSON.
+- [logfmt](logfmt/): Extract data by parsing the log line as logfmt.
 - [replace](replace/): Replace data using a regular expression.
 - [multiline](multiline/): Merge multiple lines into a multiline block.
 
@@ -27,6 +28,7 @@ Action stages:
 - [labeldrop](labeldrop/): Drop label set for the log entry.
 - [labelallow](labelallow/): Allow label set for the log entry.
 - [labels](labels/): Update the label set for the log entry.
+- [limit](limit/): Limit the rate lines will be sent to Loki.
 - [static_labels](static_labels/): Add static-labels to the log entry.
 - [metrics](metrics/): Calculate metrics based on extracted data.
 - [tenant](tenant/): Set the tenant ID value to use for the log entry.
diff --git a/docs/sources/clients/promtail/stages/json.md b/docs/sources/clients/promtail/stages/json.md
index 5b82fcac3c..5754c5baae 100644
--- a/docs/sources/clients/promtail/stages/json.md
+++ b/docs/sources/clients/promtail/stages/json.md
@@ -11,7 +11,7 @@ The `json` stage is a parsing stage that reads the log line as JSON and accepts
 ```yaml
 json:
   # Set of key/value pairs of JMESPath expressions. The key will be
-  # the key in the extracted data while the expression will the value,
+  # the key in the extracted data while the expression will be the value,
   # evaluated as a JMESPath from the source data.
   #
   # Literal JMESPath expressions can be done by wrapping a key in
diff --git a/docs/sources/clients/promtail/stages/logfmt.md b/docs/sources/clients/promtail/stages/logfmt.md
new file mode 100644
index 0000000000..7788ad2c77
--- /dev/null
+++ b/docs/sources/clients/promtail/stages/logfmt.md
@@ -0,0 +1,88 @@
+---
+title: logfmt
+menuTitle: logfmt
+description: The logfmt parsing stage reads logfmt log lines and extracts the data into labels.
+---
+# `logfmt` stage
+
+The `logfmt` stage is a parsing stage that reads the log line as [logfmt](https://brandur.org/logfmt) and allows extraction of data into labels.
+
+## Schema
+
+```yaml
+logfmt:
+  # Set of key/value pairs for mapping of logfmt fields to extracted labels. The YAML key will be
+  # the key in the extracted data, while the expression will be the YAML value. If the value
+  # is empty, then the logfmt field with the same name is extracted.
+  mapping:
+    [ <string>: <string> ... ]
+
+  # Name from extracted data to parse. If empty, uses the log message.
+  [source: <string>]
+```
+
+This stage uses the [go-logfmt](https://github.com/go-logfmt/logfmt) unmarshaler, which means non-string types like
+numbers or booleans will be unmarshaled into those types.
+The extracted data
+can hold non-string values, and this stage does not do any type conversions;
+downstream stages will need to perform correct type conversion of these values
+as necessary. Please refer to the [`template` stage](../template/) for how
+to do this.
+
+If the value extracted is a complex type, its value is extracted as a string.
+
+## Examples
+
+### Using log line
+
+For the given pipeline:
+
+```yaml
+- logfmt:
+    mapping:
+      timestamp: time
+      app:
+      duration:
+      unknown:
+```
+
+Given the following log line:
+
+```
+time=2012-11-01T22:08:41+00:00 app=loki level=WARN duration=125 message="this is a log line" extra="user=foo"
+```
+
+The following key-value pairs would be created in the set of extracted data:
+
+- `timestamp`: `2012-11-01T22:08:41+00:00`
+- `app`: `loki`
+- `duration`: `125`
+
+### Using extracted data
+
+For the given pipeline:
+
+```yaml
+- logfmt:
+    mapping:
+      extra:
+- logfmt:
+    mapping:
+      user:
+    source: extra
+```
+
+And the given log line:
+
+```
+time=2012-11-01T22:08:41+00:00 app=loki level=WARN duration=125 message="this is a log line" extra="user=foo"
+```
+
+The first stage would create the following key-value pairs in the set of
+extracted data:
+
+- `extra`: `user=foo`
+
+The second stage will parse the value of `extra` from the extracted data as logfmt
+and append the following key-value pairs to the set of extracted data:
+
+- `user`: `foo`
\ No newline at end of file
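
The two-pass behavior in the patch's "Using extracted data" example (extract `extra` as a string, then re-parse that string as logfmt via `source: extra`) can be sketched outside Promtail with a minimal, illustrative parser. This is an assumption-laden simplification, not the go-logfmt unmarshaler the stage actually uses: it keeps every value as a string and ignores escape sequences inside quoted values.

```python
import re

# One key=value pair: a quoted value ("...") or a bare whitespace-free token.
PAIR = re.compile(r'(\w+)=(?:"([^"]*)"|(\S*))')

def parse_logfmt(line):
    """Toy logfmt parser; all values stay strings, unlike go-logfmt."""
    out = {}
    for m in PAIR.finditer(line):
        # Prefer the quoted value (group 2) when it matched.
        out[m.group(1)] = m.group(2) if m.group(2) is not None else m.group(3)
    return out

line = ('time=2012-11-01T22:08:41+00:00 app=loki level=WARN duration=125 '
        'message="this is a log line" extra="user=foo"')

# First stage: `extra` is extracted as the literal string `user=foo`.
extracted = parse_logfmt(line)

# Second stage (source: extra): re-parse that value as logfmt and merge
# the result into the same set of extracted data.
extracted.update(parse_logfmt(extracted["extra"]))

print(extracted["extra"])  # user=foo
print(extracted["user"])   # foo
```

This mirrors why quoting matters in the example log line: without the quotes around `user=foo`, the inner pair would be swallowed by the first pass instead of surviving as a string for the second stage to parse.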