
---
title: group_by_value
type: processor
status: stable
categories: ["Composition"]
---

<!--
     THIS FILE IS AUTOGENERATED!

     To make changes please edit the contents of:
     lib/processor/group_by_value.go
-->

import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';

Splits a batch of messages into N batches, where each resulting batch contains a
group of messages determined by a
[function interpolated string](/docs/configuration/interpolation#bloblang-queries) evaluated
per message.

```yaml
# Config fields, showing default values
label: ""
group_by_value:
  value: ${! meta("example") }
```

This allows you to group messages using arbitrary fields within their content or
metadata, process them individually, and send them to unique locations
according to their group.

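For example, here is a minimal sketch of grouping on a field within the message
contents rather than metadata, where `customer.id` is a hypothetical JSON path
standing in for whatever field your documents actually carry:

```yaml
pipeline:
  processors:
    # Each resulting batch contains only the messages sharing a customer ID.
    - group_by_value:
        value: ${! json("customer.id") }
```
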
This processor only has an effect when applied to batches of messages, as
grouping is performed across the messages of each batch. You can find out more
about batching [in this doc](/docs/configuration/batching).

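As a sketch of one way to provide those batches, assuming a Kafka source (the
broker address, topic, and consumer group below are placeholders), a batching
policy can be set at the input level so that the processor receives groups of
messages to split:

```yaml
input:
  kafka:
    addresses: [ localhost:9092 ]
    topics: [ example_topic ]
    consumer_group: example_group
    # Collect up to 20 messages, or whatever has arrived after one second.
    batching:
      count: 20
      period: 1s

pipeline:
  processors:
    - group_by_value:
        value: ${! meta("kafka_key") }
```
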
## Fields

### `value`

The interpolated string to group messages by.
This field supports [interpolation functions](/docs/configuration/interpolation#bloblang-queries).

Type: `string`  
Default: `"${! meta(\"example\") }"`  

```yaml
# Examples

value: ${! meta("kafka_key") }

value: ${! json("foo.bar") }-${! meta("baz") }
```

## Examples

If we were consuming Kafka messages and needed to group them by their key,
archive the groups, and send them to S3 with the key as part of the path, we
could achieve that with the following:

```yaml
pipeline:
  processors:
    - group_by_value:
        value: ${! meta("kafka_key") }
    - archive:
        format: tar
    - compress:
        algorithm: gzip
output:
  aws_s3:
    bucket: TODO
    path: docs/${! meta("kafka_key") }/${! count("files") }-${! timestamp_unix_nano() }.tar.gz
```