| author | lifengchao <[email protected]> | 2024-03-13 10:55:17 +0800 |
|---|---|---|
| committer | lifengchao <[email protected]> | 2024-03-13 10:55:17 +0800 |
| commit | 420dc8afbc8721b343bf29af238a8578043e34a2 (patch) | |
| tree | 55f6184c76721ab3d668a7e35307be0a2dd96cee /docs/connector | |
| parent | 4e2a9108198a32d9ea491765dd8ddd1d1c163877 (diff) | |
[feature][connector] Update connector schema documentation
Diffstat (limited to 'docs/connector')
| -rw-r--r-- | docs/connector/connector.md | 75 |
| -rw-r--r-- | docs/connector/sink/clickhouse.md | 43 |
| -rw-r--r-- | docs/connector/sink/kafka.md | 43 |
3 files changed, 107 insertions, 54 deletions
````diff
diff --git a/docs/connector/connector.md b/docs/connector/connector.md
index 6df1e23..6bcc878 100644
--- a/docs/connector/connector.md
+++ b/docs/connector/connector.md
@@ -7,23 +7,68 @@ Source Connector contains some common core features, and each source connector s
 sources:
   ${source_name}:
     type: ${source_connector_type}
-    fields:
-      - name: ${field_name}
-        type: ${field_type}
+    # source table schema, configured through fields, local_file, or url
+    schema:
+      fields:
+        - name: ${field_name}
+          type: ${field_type}
+      # local_file: "schema path"
+      # url: "schema http url"
     properties:
       ${prop_key}: ${prop_value}
 ```
 
-| Name | Type | Required | Default | Description |
-|------|------|----------|---------|-------------|
-| type | String | Yes | - | The type of the source connector. The `SourceTableFactory` will use this value as identifier to create source connector. |
-| fields | Array of `Field` | No | - | The structure of the data, including field names and field types. |
-| properties | Map of String | Yes | - | The source connector customize properties, more details see the [Source](source) documentation. |
+| Name | Type | Required | Default | Description |
+|------|------|----------|---------|-------------|
+| type | String | Yes | - | The type of the source connector. The `SourceTableFactory` will use this value as identifier to create source connector. |
+| schema | Map | No | - | The source table schema, configured through fields, local_file, or url. |
+| properties | Map of String | Yes | - | The source connector customize properties, more details see the [Source](source) documentation. |
 
 ## Schema Field Projection
 The source connector supports reading only specified fields from the data source. For example `KafkaSource` will read all content from topic and then use `fields` to filter unnecessary columns.
 
 The Schema Structure refer to [Schema Structure](../user-guide.md#schema-structure).
 
+## Schema Config
+The schema can be configured through fields, local_file, or url.
+
+### fields
+```yaml
+schema:
+  # by array
+  fields:
+    - name: ${field_name}
+      type: ${field_type}
+```
+
+```yaml
+schema:
+  # by sql
+  fields: "struct<field_name:field_type, ...>"
+  # the outer struct<> can also be omitted
+  # fields: "field_name:field_type, ..."
+```
+
+### local_file
+
+```yaml
+schema:
+  # by local file
+  local_file: "schema path"
+```
+
+### url
+Periodically retrieves an updated schema from the URL, which enables dynamic schema. Not all connectors support dynamic schema.
+
+The connectors that currently support dynamic schema are: ClickHouse sink.
+
+```yaml
+schema:
+  # by url
+  url: "schema http url"
+```
+
 # Sink Connector
 Sink Connector contains some common core features, and each sink connector supports them to varying degrees.
 
@@ -33,12 +78,18 @@ Sink Connector contains some common core features, and each sink connector suppo
 sinks:
   ${sink_name}:
     type: ${sink_connector_type}
+    # sink table schema, configured through fields, local_file, or url. If schema is not set, all fields (Map<String, Object>) will be output.
+    schema:
+      fields: "struct<field_name:field_type, ...>"
+      # local_file: "schema path"
+      # url: "schema url"
     properties:
       ${prop_key}: ${prop_value}
 ```
 
-| Name | Type | Required | Default | Description |
-|------|------|----------|---------|-------------|
-| type | String | Yes | - | The type of the sink connector. The `SinkTableFactory` will use this value as identifier to create sink connector. |
-| properties | Map of String | Yes | - | The sink connector customize properties, more details see the [Sink](sink) documentation. |
+| Name | Type | Required | Default | Description |
+|------|------|----------|---------|-------------|
+| type | String | Yes | - | The type of the sink connector. The `SinkTableFactory` will use this value as identifier to create sink connector. |
+| schema | Map | No | - | The sink table schema, configured through fields, local_file, or url. |
+| properties | Map of String | Yes | - | The sink connector customize properties, more details see the [Sink](sink) documentation. |
````
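
For orientation, the sketch below combines the new source and sink `schema` blocks from the hunk above into one minimal pipeline definition. It is illustrative only, not content from this commit: the names `inline_source` and `kafka_sink`, the two fields, and the property placeholders are assumptions, and the `inline`/`kafka` types are simply borrowed from the examples in the files further down.

```yaml
sources:
  inline_source:            # placeholder source name
    type: inline            # any source connector type works here
    schema:
      # array form; the string form "struct<log_id:bigint, client_ip:string>" is equivalent
      fields:
        - name: log_id
          type: bigint
        - name: client_ip
          type: string
    properties:
      ${prop_key}: ${prop_value}   # connector-specific source properties

sinks:
  kafka_sink:               # placeholder sink name
    type: kafka             # any sink connector type works here
    # omit schema entirely to output all fields (Map<String, Object>)
    schema:
      fields: "log_id:bigint, client_ip:string"   # outer struct<> omitted
    properties:
      ${prop_key}: ${prop_value}   # connector-specific sink properties
```
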
````diff
diff --git a/docs/connector/sink/clickhouse.md b/docs/connector/sink/clickhouse.md
index d794767..ac37d24 100644
--- a/docs/connector/sink/clickhouse.md
+++ b/docs/connector/sink/clickhouse.md
@@ -50,27 +50,28 @@ This example read data of inline test source and write to ClickHouse table `test
 sources: # [object] Define connector source
   inline_source:
     type: inline
-    fields: # [array of object] Schema field projection, support read data only from specified fields.
-      - name: log_id
-        type: bigint
-      - name: recv_time
-        type: bigint
-      - name: server_fqdn
-        type: string
-      - name: server_domain
-        type: string
-      - name: client_ip
-        type: string
-      - name: server_ip
-        type: string
-      - name: server_asn
-        type: string
-      - name: decoded_as
-        type: string
-      - name: device_group
-        type: string
-      - name: device_tag
-        type: string
+    schema:
+      fields: # [array of object] Schema field projection, support read data only from specified fields.
+        - name: log_id
+          type: bigint
+        - name: recv_time
+          type: bigint
+        - name: server_fqdn
+          type: string
+        - name: server_domain
+          type: string
+        - name: client_ip
+          type: string
+        - name: server_ip
+          type: string
+        - name: server_asn
+          type: string
+        - name: decoded_as
+          type: string
+        - name: device_group
+          type: string
+        - name: device_tag
+          type: string
     properties:
       # # [string] Event Data, it will be parsed to Map<String, Object> by the specified format.
````
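
As a side note, the ten-field array schema above could equally be written with the compact string syntax introduced in connector.md; a sketch under that assumption:

```yaml
sources:
  inline_source:
    type: inline
    schema:
      # same ten fields as the array form above, expressed as a struct<> string
      fields: "struct<log_id:bigint, recv_time:bigint, server_fqdn:string, server_domain:string, client_ip:string, server_ip:string, server_asn:string, decoded_as:string, device_group:string, device_tag:string>"
```
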
````diff
diff --git a/docs/connector/sink/kafka.md b/docs/connector/sink/kafka.md
index 6793b21..92976d8 100644
--- a/docs/connector/sink/kafka.md
+++ b/docs/connector/sink/kafka.md
@@ -26,27 +26,28 @@ This example read data of inline test source and write to kafka topic `SESSION-R
 sources: # [object] Define connector source
   inline_source:
     type: inline
-    fields: # [array of object] Schema field projection, support read data only from specified fields.
-      - name: log_id
-        type: bigint
-      - name: recv_time
-        type: bigint
-      - name: server_fqdn
-        type: string
-      - name: server_domain
-        type: string
-      - name: client_ip
-        type: string
-      - name: server_ip
-        type: string
-      - name: server_asn
-        type: string
-      - name: decoded_as
-        type: string
-      - name: device_group
-        type: string
-      - name: device_tag
-        type: string
+    schema:
+      fields: # [array of object] Schema field projection, support read data only from specified fields.
+        - name: log_id
+          type: bigint
+        - name: recv_time
+          type: bigint
+        - name: server_fqdn
+          type: string
+        - name: server_domain
+          type: string
+        - name: client_ip
+          type: string
+        - name: server_ip
+          type: string
+        - name: server_asn
+          type: string
+        - name: decoded_as
+          type: string
+        - name: device_group
+          type: string
+        - name: device_tag
+          type: string
     properties:
       # # [string] Event Data, it will be parsed to Map<String, Object> by the specified format.
````
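
Finally, since the ClickHouse sink is named above as the connector that currently supports dynamic schema, a sink block using the `url` variant might look roughly like the sketch below. The `clickhouse` type identifier, the URL, and the property placeholder are assumptions for illustration only.

```yaml
sinks:
  clickhouse_sink:
    type: clickhouse        # assumed sink type identifier, see docs/connector/sink/clickhouse.md
    schema:
      # the schema is re-fetched from this URL periodically, enabling dynamic schema
      url: "http://example.com/schema/test_table"   # placeholder URL
    properties:
      ${prop_key}: ${prop_value}   # connector-specific sink properties
```
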
