author: doufenghu <[email protected]> 2024-07-13 17:21:53 +0800
committer: doufenghu <[email protected]> 2024-07-13 17:21:53 +0800
commit: e2196956bdc8a9737a5bbacf8a20823020936b55 (patch)
tree: 115fec75a114e47f76c84999382a3be44ea49c90 /docs/processor
parent: 321907759e968741690d691f43d1527a2b32fc4b (diff)
[Improve][Test] IT (integration test) has become optional, and it is no longer executed by default during the mvn compile and deploy process. In the job template for processor and filter, describe the implementation type based on identifiers.
Diffstat (limited to 'docs/processor')
-rw-r--r--docs/processor/projection-processor.md19
-rw-r--r--docs/processor/udf.md382
2 files changed, 223 insertions, 178 deletions
diff --git a/docs/processor/projection-processor.md b/docs/processor/projection-processor.md
index 65c7545..bc4b249 100644
--- a/docs/processor/projection-processor.md
+++ b/docs/processor/projection-processor.md
@@ -1,20 +1,26 @@
# Projection Processor
+
> Processing pipelines for projection processor
+
## Description
+
Projection processor is used to project the data from source to sink. It can be used to filter, remove, and transform fields.
It is a part of the processing pipeline. It can be used in the pre-processing, processing, and post-processing pipeline. Each processor can assemble UDFs(User-defined functions) into a pipeline.
Within the pipeline, events are processed by each Function in order, top‑>down. The UDF usage detail can be found in [UDF](udf.md).
+
## Options
-| name | type | required | default value |
-|----------------|---------|----------|----------------------------------------------------------------------------------------------------------------|
-| type | String | Yes | The type of the processor, now only support `com.geedgenetworks.core.processor.projection.ProjectionProcessor` |
-| output_fields | Array | No | Array of String. The list of fields that need to be kept. Fields not in the list will be removed. |
-| remove_fields | Array | No | Array of String. The list of fields that need to be removed. |
-| functions | Array | No | Array of Object. The list of functions that need to be applied to the data. |
+| name | type | required | default value |
+|---------------|--------|----------|----------------------------------------------------------------------------------------------------------------|
+| type | String | Yes | The type of the processor, now only support `com.geedgenetworks.core.processor.projection.ProjectionProcessor` |
+| output_fields | Array | No | Array of String. The list of fields that need to be kept. Fields not in the list will be removed. |
+| remove_fields | Array | No | Array of String. The list of fields that need to be removed. |
+| functions | Array | No | Array of Object. The list of functions that need to be applied to the data. |
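
Taken together, the options above can be combined in a single processor definition. A minimal, hypothetical sketch (the processor name and all field values are illustrative, not taken from the source):

```yaml
# Hypothetical projection processor definition; names and values are illustrative.
projection_processor:
  type: com.geedgenetworks.core.processor.projection.ProjectionProcessor
  # Keep only these fields; all others are removed.
  output_fields: [client_ip, server_ip, recv_time]
  # Alternatively, drop specific fields instead of whitelisting.
  remove_fields: [http_request_line]
  # Functions are applied to each event in order, top-down.
  functions:
    - function: DROP
      filter: event.server_ip == '4.4.4.4'
```

In practice either `output_fields` or `remove_fields` is typically used; both appear here only to show the shape of each option.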
## Usage Example
+
This example uses the projection processor to remove the fields `http_request_line`, `http_response_line`, and `http_response_content_type`, and uses the DROP function to filter out all events whose `server_ip` is `4.4.4.4`.
+
```yaml
sources:
inline_source:
@@ -64,4 +70,3 @@ application:
downstream: []
```
-
diff --git a/docs/processor/udf.md b/docs/processor/udf.md
index 6bd87b0..2a705fd 100644
--- a/docs/processor/udf.md
+++ b/docs/processor/udf.md
@@ -1,6 +1,8 @@
# UDF
+
> The functions for projection processors.
-## Function of content
+
+## Function list
- [Asn Lookup](#asn-lookup)
- [Base64 Decode](#base64-decode)
@@ -21,26 +23,30 @@
- [Unix Timestamp Converter](#unix-timestamp-converter)
## Description
+
UDF(User Defined Function) is used to extend the functions of projection processor. The UDF is a part of the processing pipeline. It can be used in the pre-processing pipeline, processing pipeline, and post-processing pipeline.
+
## UDF Definition
+
A UDF includes the following parts: name, event(processing data), context, evaluate function, open function, and close function.
- name: Function name, with uppercase words separated by underscores, used for function registration.
-- event: The data to be processed. It is organized in a Map<String, Object> structure.
+- event: The data to be processed. It is organized in a Map<String, Object> structure.
- context: Function context, used to store the state of the function. Including the following parameters:
- - `filter`: Filter expression, string type. It is used to filter events that need to processed by the function. The expression is written in Aviator expression language. For example, `event.server_ip == '.
- - `lookup_fields`: The fields that need to be used as lookup keys. It is an array of string type. For example, `['server_ip', 'client_ip']`.
- - `output_fields`: The fields are used to append the result to the event. It is an array of string type. For example, `['server_ip', 'client_ip']`. If the field already exists in the event, the value will be overwritten.
- - `parameters`: Custom parameters. It is a Map<String, Object> type.
-- evaluate function: The function to process the event. It is a function that returns a Map<String, Object> type.
+  - `filter`: Filter expression, string type. It is used to filter the events that need to be processed by the function. The expression is written in the Aviator expression language. For example, `event.server_ip == '4.4.4.4'`.
+  - `lookup_fields`: The fields to use as lookup keys. It is an array of strings. For example, `['server_ip', 'client_ip']`.
+  - `output_fields`: The fields used to append the result to the event. It is an array of strings. For example, `['server_ip', 'client_ip']`. If a field already exists in the event, its value will be overwritten.
+  - `parameters`: Custom parameters. It is a Map<String, Object> type.
+- evaluate function: The function that processes the event. It returns a Map<String, Object> value.
- open function: Initialize the resources used by the function.
- close function: Release the resources used by the function.
-
+
### Functions
Functions define common parameters: `filter`, `lookup_fields`, `output_fields`, `parameters`, and return a Map<String, Object> value of the event.
-``` FUNCTION_NAME(filter, lookup_fields, output_fields[, parameters])```
+``` FUNCTION_NAME(filter, lookup_fields, output_fields[, parameters])```
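
The signature above can be read as a template for a YAML function entry. A hedged sketch of the general shape (`FUNCTION_NAME` and the field names are placeholders, not real fields):

```yaml
# Generic shape of a function entry; all names are placeholders.
- function: FUNCTION_NAME
  filter: event.some_field == 'value'   # optional Aviator filter expression
  lookup_fields: [input_field]          # fields read by the function
  output_fields: [result_field]         # fields written back to the event
  parameters:                           # function-specific options
    some_option: some_value
```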
### Asn Lookup
+
Asn lookup function is used to lookup the asn information by ip address. You need to host the `.mmdb` database file from Knowledge Base Repository.
```ASN_LOOKUP(filter, lookup_fields, output_fields[, parameters])```
@@ -48,20 +54,22 @@ Asn lookup function is used to lookup the asn information by ip address. You nee
- lookup_fields: required
- output_fields: required
- parameters: required
- - kb_name: required. The name of the knowledge base.
- - option: required. Now only support `IP_TO_ASN`.
+  - kb_name: required. The name of the knowledge base.
+  - option: required. Now only supports `IP_TO_ASN`.
Example:
+
```yaml
- - function: ASN_LOOKUP
- lookup_fields: [client_ip]
- output_fields: [client_asn]
- parameters:
- kb_name: tsg_ip_asn
- option: IP_TO_ASN
+- function: ASN_LOOKUP
+ lookup_fields: [client_ip]
+ output_fields: [client_asn]
+ parameters:
+ kb_name: tsg_ip_asn
+ option: IP_TO_ASN
```
### Base64 Decode
+
Base64 decode function is used to decode the base64 encoded string.
```BASE64_DECODE_TO_STRING(filter, output_fields[, parameters])```
@@ -69,19 +77,21 @@ Base64 decode function is used to decode the base64 encoded string.
- lookup_fields: not required
- output_fields: required
- parameters: required
- - value_field: `<String>` required.
- - charset_field:`<String>` optional. Default is `UTF-8`.
+  - value_field: `<String>` required.
+  - charset_field: `<String>` optional. Default is `UTF-8`.
Example:
+
```yaml
- - function: BASE64_DECODE_TO_STRING
- output_fields: [mail_attachment_name]
- parameters:
- value_field: mail_attachment_name
- charset_field: mail_attachment_name_charset
+- function: BASE64_DECODE_TO_STRING
+ output_fields: [mail_attachment_name]
+ parameters:
+ value_field: mail_attachment_name
+ charset_field: mail_attachment_name_charset
```
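
For instance, assuming an event carrying the following payload (value illustrative; `aGVsbG8=` is the Base64 encoding of the UTF-8 string `hello`):

```json
{"mail_attachment_name": "aGVsbG8="}
```

after the function above runs, `mail_attachment_name` would hold the decoded string `hello`.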
### Base64 Encode
+
Base64 encode function is commonly used to encode binary data to a base64 string, especially when the data needs to be stored and transferred over media designed to deal with text. This encoding helps ensure that the data remains intact, without modification, during transport.
```BASE64_ENCODE_TO_STRING(filter, output_fields[, parameters])```
@@ -89,17 +99,19 @@ Base64 encode function is commonly used to encode the binary data to base64 stri
- lookup_fields: not required
- output_fields: required
- parameters: required
- - value_field: `<String>` required.
+  - value_field: `<String>` required.
Example:
+
```yaml
- - function: BASE64_ENCODE_TO_STRING
- output_fields: [packet]
- parameters:
- value_field: packet
+- function: BASE64_ENCODE_TO_STRING
+ output_fields: [packet]
+ parameters:
+ value_field: packet
```
### Current Unix Timestamp
+
Current unix timestamp function is used to get the current unix timestamp.
```CURRENT_UNIX_TIMESTAMP(output_fields[, parameters])```
@@ -107,17 +119,19 @@ Current unix timestamp function is used to get the current unix timestamp.
- lookup_fields: not required
- output_fields: required
- parameters: optional
- - precision: `<String>` optional. Default is `seconds`. Enum: `milliseconds`, `seconds`.
+  - precision: `<String>` optional. Default is `seconds`. Enum: `milliseconds`, `seconds`.
Example:
+
```yaml
- - function: CURRENT_UNIX_TIMESTAMP
- output_fields: [recv_time]
- parameters:
- precision: seconds
+- function: CURRENT_UNIX_TIMESTAMP
+ output_fields: [recv_time]
+ parameters:
+ precision: seconds
```
### Domain
+
Domain function is used to extract the domain from the url.
```DOMAIN(filter, lookup_fields, output_fields[, parameters])```
@@ -125,23 +139,25 @@ Domain function is used to extract the domain from the url.
- lookup_fields: required. Supports more than one field. Fields are processed from left to right, and the result is overwritten whenever a field's processed value is not null.
- output_fields: required
- parameters: required
- - option: `<String>` required. Enum: `TOP_LEVEL_DOMAIN`, `FIRST_SIGNIFICANT_SUBDOMAIN`.
+  - option: `<String>` required. Enum: `TOP_LEVEL_DOMAIN`, `FIRST_SIGNIFICANT_SUBDOMAIN`.
+
+#### Option
-#### Option
- `TOP_LEVEL_DOMAIN` is used to extract the top level domain from the url. For example, `www.abc.com` will be extracted to `com`.
- `FIRST_SIGNIFICANT_SUBDOMAIN` is used to extract the first significant subdomain from the url. For example, `www.abc.com` will be extracted to `abc.com`.
Example:
```yaml
- - function: DOMAIN
- lookup_fields: [http_host, ssl_sni, quic_sni]
- output_fields: [server_domain]
- parameters:
- option: FIRST_SIGNIFICANT_SUBDOMAIN
+- function: DOMAIN
+ lookup_fields: [http_host, ssl_sni, quic_sni]
+ output_fields: [server_domain]
+ parameters:
+ option: FIRST_SIGNIFICANT_SUBDOMAIN
```
### Drop
+
Drop function is used to filter the event. If the filter expression is true, the event will be dropped. Otherwise, the event will be passed to downstream.
```DROP(filter)```
@@ -151,11 +167,14 @@ Drop function is used to filter the event. If the filter expression is true, the
- parameters: not required
Example:
+
```yaml
- - function: DROP
- filter: event.server_ip == '4.4.4.4'
+- function: DROP
+ filter: event.server_ip == '4.4.4.4'
```
+
### Eval
+
Eval function is used to add or remove fields from events by evaluating a value expression.
```EVAL(filter, output_fields[, parameters])```
@@ -163,25 +182,30 @@ Eval function is used to adds or removes fields from events by evaluating an val
- lookup_fields: not required
- output_fields: required
- parameters: required
- - value_expression: `<String>` required. Enter a value expression to set the field’s value – this can be a constant.
+  - value_expression: `<String>` required. Enter a value expression to set the field's value; this can be a constant.
Example 1:
Add a field `ingestion_time` with value `recv_time`:
+
```yaml
- - function: EVAL
- output_fields: [ingestion_time]
- parameters:
- value_expression: recv_time
+- function: EVAL
+ output_fields: [ingestion_time]
+ parameters:
+ value_expression: recv_time
```
+
Example 2:
If the value of `direction` is `69`, `internal_ip` will be set to `client_ip`; otherwise, it will be set to `server_ip`.
+
```yaml
- - function: EVAL
- output_fields: [internal_ip]
- parameters:
- value_expression: 'direction=69 ? client_ip : server_ip'
+- function: EVAL
+ output_fields: [internal_ip]
+ parameters:
+    value_expression: 'direction==69 ? client_ip : server_ip'
```
+
### Flatten
+
Flatten the fields of a nested structure to the top level. The new fields are named using the field name prefixed with the names of the struct fields needed to reach it, separated by dots by default.
```FLATTEN(filter, lookup_fields, output_fields[, parameters])```
@@ -189,33 +213,36 @@ Flatten the fields of nested structure to the top level. The new fields name are
- lookup_fields: optional
- output_fields: not required
- parameters: optional
- - prefix: `<String>` optional. Prefix string for flattened field names. Default is empty.
- - depth: `<Integer>` optional. Number representing the nested levels to consider for flattening. Minimum 1. Default is `5`.
- - delimiter: `<String>` optional. The string used to join nested keys Default is `.`.
- - json_string_keys: `<Array>` optional. The keys of the json string fields. It indicates keys that contain JSON strings and should be parsed and flattened. Default is empty.
+  - prefix: `<String>` optional. Prefix string for flattened field names. Default is empty.
+  - depth: `<Integer>` optional. Number of nested levels to consider for flattening. Minimum 1. Default is `5`.
+  - delimiter: `<String>` optional. The string used to join nested keys. Default is `.`.
+  - json_string_keys: `<Array>` optional. The keys of fields that contain JSON strings, which should be parsed and flattened. Default is empty.
Example 1:
Flatten the nested structure of fields and tags in Metrics. If lookup_fields is empty, flatten all nested structures.
```yaml
- - function: FLATTEN
- lookup_fields: [tags,fields]
+- function: FLATTEN
+ lookup_fields: [tags,fields]
```
Example 2:
Flatten the nested structure of the session record field `encapsulation` (JSON string format), add the prefix `tunnels`, specify the nesting depth as `3`, and use a dot `.` as the delimiter.
+
```yaml
- - function: FLATTEN
- lookup_fields: [encapsulation]
- parameters:
- prefix: tunnels
- depth: 3
- delimiter: .
- json_string_keys: [encapsulation]
+- function: FLATTEN
+ lookup_fields: [encapsulation]
+ parameters:
+ prefix: tunnels
+ depth: 3
+ delimiter: .
+ json_string_keys: [encapsulation]
```
+
Output:
+
```json
{
"tunnels.encapsulation.ipv4.client_ip": "192.168.11.12",
@@ -224,6 +251,7 @@ Output:
```
### From Unix Timestamp
+
From unix timestamp function is used to convert the unix timestamp to date time string. The default time zone is UTC+0.
```FROM_UNIX_TIMESTAMP(filter, lookup_fields, output_fields[, parameters])```
@@ -231,22 +259,25 @@ From unix timestamp function is used to convert the unix timestamp to date time
- lookup_fields: required
- output_fields: required
- parameters: optional
- - precision: `<String>` optional. Default is `seconds`. Enum: `milliseconds`, `seconds`.
+  - precision: `<String>` optional. Default is `seconds`. Enum: `milliseconds`, `seconds`.
+
+#### Precision
-#### Precision
- `milliseconds` is used to convert the unix timestamp to milliseconds date time string. For example, `1619712000` will be converted to `2021-04-30 00:00:00.000`.
- `seconds` is used to convert the unix timestamp to seconds date time string. For example, `1619712000` will be converted to `2021-04-30 00:00:00`.
Example:
+
```yaml
- - function: FROM_UNIX_TIMESTAMP
- lookup_fields: [recv_time]
- output_fields: [recv_time_string]
- parameters:
- precision: seconds
+- function: FROM_UNIX_TIMESTAMP
+ lookup_fields: [recv_time]
+ output_fields: [recv_time_string]
+ parameters:
+ precision: seconds
```
### Generate String Array
+
Generate string array function is used to merge multiple fields into a string array. Each source field may be a string or a string array.
```GENERATE_STRING_ARRAY(filter, lookup_fields, output_fields)```
@@ -256,12 +287,15 @@ Generate string array function is used to merge multiple fields into a string ar
- parameters: not required
Example:
+
```yaml
- - function: GENERATE_STRING_ARRAY
- lookup_fields: [http_host, ssl_sni, quic_sni]
- output_fields: [server_domains]
+- function: GENERATE_STRING_ARRAY
+ lookup_fields: [http_host, ssl_sni, quic_sni]
+ output_fields: [server_domains]
```
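
As an illustration of the example above (event values assumed; the source does not specify how absent fields are handled), an event carrying two of the lookup fields would gain a merged array field:

```json
{
  "http_host": "www.abc.com",
  "ssl_sni": "abc.com",
  "server_domains": ["www.abc.com", "abc.com"]
}
```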
+
### GeoIP Lookup
+
GeoIP lookup function is used to lookup the geoip information by ip address. You need to host the `.mmdb` database file from Knowledge Base Repository.
```GEOIP_LOOKUP(filter, lookup_fields, output_fields[, parameters])```
@@ -269,29 +303,31 @@ GeoIP lookup function is used to lookup the geoip information by ip address. You
- lookup_fields: required
- output_fields: optional
- parameters: required
- - kb_name: `<String>` required. The name of the knowledge base.
- - option: `<String>` required. Enum: `IP_TO_COUNTRY`, `IP_TO_PROVINCE`, `IP_TO_CITY`, `IP_TO_SUBDIVISION_ADDR`, `IP_TO_DETAIL`, `IP_TO_LATLNG`, `IP_TO_PROVIDER`, `IP_TO_JSON`, `IP_TO_OBJECT`.
- - geolocation_field_mapping : `<Map<String, String>>` optional. The option is required when the option is `IP_TO_OBJECT`. The mapping of the geolocation fields. The key is the field name of the knowledge base , and the value is the field name of the event.
- - COUNTRY: `<String>` optional.
- - PROVINCE: `<String>` optional.
- - CITY: `<String>` optional.
- - LONGITUDE: `<String>` optional.
- - LATITUDE: `<String>` optional.
- - ISP: `<String>` optional.
- - ORGANIZATION: `<String>` optional.
-
-#### Option
- - `IP_TO_COUNTRY` is used to lookup the country or region information by ip address.
- - `IP_TO_PROVINCE` is used to lookup the province or state information by ip address.
- - `IP_TO_CITY` is used to lookup the city information by ip address.
- - `IP_TO_SUBDIVISION_ADDR` is used to lookup the subdivision address information by ip address.
- - `IP_TO_DETAIL` is used to lookup the above four levels of information by ip address. It separated by `.`.
- - `IP_TO_LATLNG` is used to lookup the latitude and longitude information by ip address. It separated by `,`.
- - `IP_TO_PROVIDER` is used to lookup the provider information by ip address.
- - `IP_TO_JSON` is used to lookup the above information by ip address. The result is a json string.
- - `IP_TO_OBJECT` is used to lookup the above information by ip address. The result is a `LocationResponse` object.
-
-#### GeoLocation Field Mapping
+  - kb_name: `<String>` required. The name of the knowledge base.
+  - option: `<String>` required. Enum: `IP_TO_COUNTRY`, `IP_TO_PROVINCE`, `IP_TO_CITY`, `IP_TO_SUBDIVISION_ADDR`, `IP_TO_DETAIL`, `IP_TO_LATLNG`, `IP_TO_PROVIDER`, `IP_TO_JSON`, `IP_TO_OBJECT`.
+  - geolocation_field_mapping: `<Map<String, String>>` optional, but required when the option is `IP_TO_OBJECT`. The mapping of the geolocation fields: the key is the field name in the knowledge base, and the value is the field name in the event.
+    - COUNTRY: `<String>` optional.
+    - PROVINCE: `<String>` optional.
+    - CITY: `<String>` optional.
+    - LONGITUDE: `<String>` optional.
+    - LATITUDE: `<String>` optional.
+    - ISP: `<String>` optional.
+    - ORGANIZATION: `<String>` optional.
+
+#### Option
+
+- `IP_TO_COUNTRY` is used to lookup the country or region information by ip address.
+- `IP_TO_PROVINCE` is used to lookup the province or state information by ip address.
+- `IP_TO_CITY` is used to lookup the city information by ip address.
+- `IP_TO_SUBDIVISION_ADDR` is used to lookup the subdivision address information by ip address.
+- `IP_TO_DETAIL` is used to lookup the above four levels of information by ip address, separated by `.`.
+- `IP_TO_LATLNG` is used to lookup the latitude and longitude information by ip address, separated by `,`.
+- `IP_TO_PROVIDER` is used to lookup the provider information by ip address.
+- `IP_TO_JSON` is used to lookup the above information by ip address. The result is a json string.
+- `IP_TO_OBJECT` is used to lookup the above information by ip address. The result is a `LocationResponse` object.
+
+#### GeoLocation Field Mapping
+
- `COUNTRY` is used to map the country information to the event field.
- `PROVINCE` is used to map the province information to the event field.
- `CITY` is used to map the city information to the event field.
@@ -303,27 +339,29 @@ GeoIP lookup function is used to lookup the geoip information by ip address. You
Example:
```yaml
- - function: GEOIP_LOOKUP
- lookup_fields: [ client_ip ]
- output_fields: [ client_geolocation ]
- parameters:
- kb_name: tsg_ip_location
- option: IP_TO_DETAIL
+- function: GEOIP_LOOKUP
+ lookup_fields: [ client_ip ]
+ output_fields: [ client_geolocation ]
+ parameters:
+ kb_name: tsg_ip_location
+ option: IP_TO_DETAIL
```
+
```yaml
- - function: GEOIP_LOOKUP
- lookup_fields: [ server_ip ]
- output_fields: []
- parameters:
- kb_name: tsg_ip_location
- option: IP_TO_OBJECT
- geolocation_field_mapping:
- COUNTRY: server_country
- PROVINCE: server_super_administrative_area
- CITY: server_administrative_area
+- function: GEOIP_LOOKUP
+ lookup_fields: [ server_ip ]
+ output_fields: []
+ parameters:
+ kb_name: tsg_ip_location
+ option: IP_TO_OBJECT
+ geolocation_field_mapping:
+ COUNTRY: server_country
+ PROVINCE: server_super_administrative_area
+ CITY: server_administrative_area
```
### JSON Extract
+
JSON extract function is used to extract the value from json string.
```JSON_EXTRACT(filter, lookup_fields, output_fields[, parameters])```
@@ -331,16 +369,16 @@ JSON extract function is used to extract the value from json string.
- lookup_fields: required
- output_fields: required
- parameters: required
- - value_expression: `<String>` required. The json path expression.
+  - value_expression: `<String>` required. The JSON path expression.
Example:
```yaml
- - function: JSON_EXTRACT
- lookup_fields: [ device_tag ]
- output_fields: [ device_group ]
- parameters:
- value_expression: $.tags[?(@.tag=='device_group')][0].value
+- function: JSON_EXTRACT
+ lookup_fields: [ device_tag ]
+ output_fields: [ device_group ]
+ parameters:
+ value_expression: $.tags[?(@.tag=='device_group')][0].value
```
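
The JSON path in the example implies a `device_tag` payload shaped roughly like the following (structure inferred from the expression, not stated in the source; values are illustrative):

```json
{
  "tags": [
    {"tag": "device_group", "value": "core-switches"},
    {"tag": "site", "value": "dc-1"}
  ]
}
```

The expression selects the `value` of the first `tags` entry whose `tag` equals `device_group`, so `device_group` would be set to `core-switches` here.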
### Path Combine
@@ -352,19 +390,20 @@ Path combine function is used to combine the file path. The path value can be co
- lookup_fields: required
- output_fields: required
- parameters: required
- - path: `<Array>` required.
+  - path: `<Array>` required.
Example:
-
+
```yaml
- - function: PATH_COMBINE
- lookup_fields: [ packet_capture_file ]
- output_fields: [ packet_capture_file ]
- parameters:
- path: [ props.hos.path, props.hos.bucket.name.traffic_file, packet_capture_file ]
+- function: PATH_COMBINE
+ lookup_fields: [ packet_capture_file ]
+ output_fields: [ packet_capture_file ]
+ parameters:
+ path: [ props.hos.path, props.hos.bucket.name.traffic_file, packet_capture_file ]
```
### Rename
+
Rename function is used to rename or reformat (e.g. by replacing underscores with dots) field names.
```RENAME(filter, lookup_fields, output_fields, parameters)```
@@ -372,26 +411,27 @@ Rename function is used to rename or reformat(e.g. by replacing character unders
- lookup_fields: not required
- output_fields: not required
- parameters: required
- - parent_fields: `<Array>` optional. Specify fields whose children will inherit the Rename fields and Rename expression operations.
- - rename_fields: `Map<String, String>` required. The key is the original field name, and the value is the new field name.
- - current_field_name: `<String>` required. The original field name.
- - new_field_name: `<String>` required. The new field name.
- - rename_expression: `<String>` optional. AviatorScript expression whose returned value will be used to rename fields.
+  - parent_fields: `<Array>` optional. Specify fields whose children will inherit the rename_fields and rename_expression operations.
+  - rename_fields: `Map<String, String>` required. The key is the original field name, and the value is the new field name.
+  - current_field_name: `<String>` required. The original field name.
+  - new_field_name: `<String>` required. The new field name.
+  - rename_expression: `<String>` optional. AviatorScript expression whose returned value will be used to rename fields.
> A single Function can include both rename_fields (to rename specified field names) and rename_expression (to globally rename fields). However, the rename_fields strategy will execute first.
+
Example 1:
Remove the prefix "tags_" from the field names and rename the field "timestamp_ms" to "recv_time_ms".
```yaml
- - function: RENAME
- - parameters:
- rename_fields:
- - timestamp_ms: recv_time_ms
- rename_expression: key=string.replace_all(key,'tags_',''); return key;
-
+- function: RENAME
+  parameters:
+    rename_fields:
+      - timestamp_ms: recv_time_ms
+    rename_expression: key=string.replace_all(key,'tags_',''); return key;
+
```
Example 2:
@@ -399,15 +439,15 @@ Example 2:
Rename the field `client_ip` to `source_ip`, including the fields under the `encapsulation.ipv4` tunnel.
```yaml
- - function: RENAME
- - parameters:
- parent_fields: [encapsulation.ipv4]
- rename_fields:
- - client_ip: source_ip
-
+- function: RENAME
+  parameters:
+    parent_fields: [encapsulation.ipv4]
+    rename_fields:
+      - client_ip: source_ip
+
```
-Output: `source_ip:192.168.4.1, encapsulation.ipv4.source_ip:192.168.12.12`
+Output: `source_ip:192.168.4.1, encapsulation.ipv4.source_ip:192.168.12.12`
### Snowflake ID
@@ -422,14 +462,15 @@ Snowflake ID function is used to generate the snowflake id. The snowflake id is
- lookup_fields: not required
- output_fields: required
- parameters: optional
- - data_center_id_num: `<Integer>` optional. Default is `0`, range is `0-31`.
+  - data_center_id_num: `<Integer>` optional. Default is `0`, range is `0-31`.
Example:
+
```yaml
- - function: SNOWFLAKE_ID
- output_fields: [log_id]
- parameters:
- data_center_id_num: 1
+- function: SNOWFLAKE_ID
+ output_fields: [log_id]
+ parameters:
+ data_center_id_num: 1
```
### String Joiner
@@ -441,41 +482,40 @@ String joiner function joins multiple string fields using a delimiter, prefix, a
- lookup_fields: required. Supports more than one field.
- output_fields: required
- parameters: optional
- - delimiter: `<String>` optional. Default is `,`.
- - prefix: `<String>` optional. Default is empty string.
- - suffix: `<String>` optional. Default is empty string.
+  - delimiter: `<String>` optional. Default is `,`.
+  - prefix: `<String>` optional. Default is empty string.
+  - suffix: `<String>` optional. Default is empty string.
Example:
+
```yaml
- - function: STRING_JOINER
- lookup_fields: [http_host, ssl_sni, quic_sni]
- output_fields: [server_domains]
- parameters:
- delimiter: ','
- prefix: '['
- suffix: ']'
+- function: STRING_JOINER
+ lookup_fields: [http_host, ssl_sni, quic_sni]
+ output_fields: [server_domains]
+ parameters:
+ delimiter: ','
+ prefix: '['
+ suffix: ']'
```
### Unix Timestamp Converter
-Unix timestamp converter function is used to convert the unix timestamp precision.
+Unix timestamp converter function is used to convert the unix timestamp precision.
```UNIX_TIMESTAMP_CONVERTER(filter, lookup_fields, output_fields[, parameters])```
- filter: optional
- lookup_fields: required
- output_fields: required
- parameters: required
- - precision: `<String>` required. Enum: `milliseconds`, `seconds`, `minutes`. The minutes precision is used to generate Unix timestamp, round it to the minute level, and output it in seconds format.
- - Example:
+  - precision: `<String>` required. Enum: `milliseconds`, `seconds`, `minutes`. The `minutes` precision rounds the Unix timestamp down to the minute and outputs it in seconds format.
+
+Example:
`__timestamp` is an internal field, populated from the source ingestion time or the current unix timestamp.
+
```yaml
- - function: UNIX_TIMESTAMP_CONVERTER
- lookup_fields: [__timestamp]
- output_fields: [recv_time]
- parameters:
- precision: seconds
+- function: UNIX_TIMESTAMP_CONVERTER
+ lookup_fields: [__timestamp]
+ output_fields: [recv_time]
+ parameters:
+ precision: seconds
```
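
For illustration (input values assumed, including that `__timestamp` arrives in milliseconds), the conversions behave as sketched in the comments of this variant, which rounds to the minute:

```yaml
# Illustrative conversions (input values assumed):
#   precision: seconds -> 1619712000123 (ms) becomes 1619712000
#   precision: minutes -> 1619712059 (s) becomes 1619712000 (rounded to the minute, seconds format)
- function: UNIX_TIMESTAMP_CONVERTER
  lookup_fields: [__timestamp]
  output_fields: [recv_time_minute]
  parameters:
    precision: minutes
```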
-
-
-