author     王宽 <[email protected]>   2024-11-08 03:14:35 +0000
committer  王宽 <[email protected]>   2024-11-08 03:14:35 +0000
commit     fc5cfd45a472784b8e21480639d6753e73b021f1 (patch)
tree       5d637c0495c91239517efb8a7e7c0e98ead81a00 /docs/connector
parent     7868728ddbe3dc08263b1d21b5ffce5dcd9b8052 (diff)
parent     46475bc4b47a61a578086ed7720aa53ef24fe077 (diff)
Merge branch 'improve/uuidv5' into 'release/1.7.0'

[Improve][Encrypt] Enhance Encrypt is applied to encryption at transit and...

See merge request galaxy/platform/groot-stream!134
Diffstat (limited to 'docs/connector')
-rw-r--r--  docs/connector/formats/csv.md     11
-rw-r--r--  docs/connector/sink/starrocks.md  10
2 files changed, 10 insertions(+), 11 deletions(-)
diff --git a/docs/connector/formats/csv.md b/docs/connector/formats/csv.md
index ca8d10b..76769b2 100644
--- a/docs/connector/formats/csv.md
+++ b/docs/connector/formats/csv.md
@@ -4,8 +4,7 @@
>
> ## Description
>
-> The CSV format allows to read and write CSV data based on an CSV schema. Currently, the CSV schema is derived from table schema.
-> **The CSV format must config schema for source/sink**.
+> The CSV format allows for reading and writing CSV data based on a schema. Currently, the CSV schema is derived from the table schema.
| Name | Supported Versions | Maven |
|--------------|--------------------|---------------------------------------------------------------------------------------------------------------------------|
@@ -16,12 +15,12 @@
| Name | Type | Required | Default | Description |
|-----------------------------|-----------|----------|---------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| format | String | Yes | (none) | Specify the format to use; here it should be 'csv'. |
-| csv.field.delimiter | String | No | , | Field delimiter character (',' by default), must be single character. You can use backslash to specify special characters, e.g. '\t' represents the tab character. |
-| csv.disable.quote.character | Boolean | No | false | Disabled quote character for enclosing field values (false by default). If true, option 'csv.quote.character' can not be set. |
-| csv.quote.character | String | No | " | Quote character for enclosing field values (" by default). |
+| csv.field.delimiter | String | No | , | Field delimiter character (`,` by default); must be a single character. You can use a backslash to specify special characters, e.g. '\t' represents the tab character. |
+| csv.disable.quote.character | Boolean | No | false | Disable the quote character for enclosing field values (`false` by default). If true, the option `csv.quote.character` cannot be set. |
+| csv.quote.character | String | No | " | Quote character for enclosing field values (`"` by default). |
| csv.allow.comments | Boolean | No | false | Ignore comment lines that start with '#' (disabled by default). If enabled, make sure to also ignore parse errors to allow empty rows. |
| csv.ignore.parse.errors | Boolean | No | false | Skip fields and rows with parse errors instead of failing. Fields are set to null in case of errors. |
-| csv.array.element.delimiter | String | No | ; | Array element delimiter string for separating array and row element values (';' by default). |
+| csv.array.element.delimiter | String | No | ; | Array element delimiter string for separating array and row element values (`;` by default). |
| csv.escape.character | String | No | (none) | Escape character for escaping values (disabled by default). |
| csv.null.literal | String | No | (none) | Null literal string that is interpreted as a null value (disabled by default). |
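
For orientation, here is a minimal sketch of how the CSV options above might be combined in a connector configuration. Only the `format` and `csv.*` keys come from the option table; the surrounding source block, the `type` and `path` fields, and their values are hypothetical placeholders.

```yaml
sources:                             # hypothetical source block for illustration
  - type: file                       # assumed connector type, not from this page
    path: /data/input.csv            # assumed option
    format: csv                      # must be 'csv' to activate this format
    csv.field.delimiter: "\t"        # tab-separated fields instead of the default ','
    csv.quote.character: "\""        # default quote character made explicit
    csv.allow.comments: true         # skip lines starting with '#'
    csv.ignore.parse.errors: true    # recommended together with csv.allow.comments
    csv.null.literal: "N/A"          # string interpreted as a null value
```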
diff --git a/docs/connector/sink/starrocks.md b/docs/connector/sink/starrocks.md
index f07e432..208fa39 100644
--- a/docs/connector/sink/starrocks.md
+++ b/docs/connector/sink/starrocks.md
@@ -1,25 +1,25 @@
# Starrocks
-> Starrocks sink connector
+> StarRocks sink connector
>
> ## Description
>
-> Sink connector for Starrocks, know more in https://docs.starrocks.io/zh/docs/loading/Flink-connector-starrocks/.
+> Sink connector for StarRocks; see https://docs.starrocks.io/zh/docs/loading/Flink-connector-starrocks/ for more details.
## Sink Options
-Starrocks sink custom properties. If properties belongs to Starrocks Flink Connector Config, you can use `connection.` prefix to set.
+StarRocks sink custom properties. If a property belongs to the StarRocks Flink Connector configuration, you can set it with the `connection.` prefix.
| Name | Type | Required | Default | Description |
|---------------------|---------|----------|---------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| log.failures.only | Boolean | No | true | Optional flag controlling whether the sink should fail on errors or only log them. If set to true, exceptions are only logged; if set to false, exceptions are eventually thrown. Defaults to true. |
| connection.jdbc-url | String | Yes | (none) | The address that is used to connect to the MySQL server of the FE. You can specify multiple addresses, which must be separated by a comma (,). Format: jdbc:mysql://<fe_host1>:<fe_query_port1>,<fe_host2>:<fe_query_port2>,<fe_host3>:<fe_query_port3>.. |
| connection.load-url | String | Yes | (none) | The address that is used to connect to the HTTP server of the FE. You can specify multiple addresses, which must be separated by a semicolon (;). Format: <fe_host1>:<fe_http_port1>;<fe_host2>:<fe_http_port2>.. |
-| connection.config | Map | No | (none) | Starrocks Flink Connector Options, know more in https://docs.starrocks.io/docs/loading/Flink-connector-starrocks/#options. |
+| connection.config | Map | No | (none) | StarRocks Flink Connector options; see https://docs.starrocks.io/docs/loading/Flink-connector-starrocks/#options for more details. |
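
As a hedged illustration of the `connection.` prefix described above, the sketch below shows how prefixed keys might be passed through to the underlying StarRocks Flink connector. Only `log.failures.only`, `connection.jdbc-url`, `connection.load-url`, and `connection.config` come from the table; the sink block structure, host names, and nested option names are assumptions.

```yaml
sinks:                                                     # hypothetical sink block for illustration
  - type: starrocks                                        # assumed connector type
    log.failures.only: false                               # fail the job instead of only logging errors
    connection.jdbc-url: "jdbc:mysql://fe1:9030,fe2:9030"  # FE MySQL addresses, comma-separated
    connection.load-url: "fe1:8030;fe2:8030"               # FE HTTP addresses, semicolon-separated
    connection.config:                                     # forwarded to the StarRocks Flink connector
      database-name: demo                                  # assumed connector options, see the StarRocks docs
      table-name: test
```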
## Example
-This example read data of inline test source and write to Starrocks table `test`.
+This example reads data from the inline test source and writes it to the StarRocks table `test`.
```yaml
sources: # [object] Define connector source