[ISSUE #204] Add Kafka connector docs (#205)
* add kafka connector doc

* modify punctuation
Ish1yu authored Mar 12, 2024
1 parent 8182f9b commit 55dc108
Showing 4 changed files with 119 additions and 7 deletions.
8 changes: 4 additions & 4 deletions docs/design-document/03-connect/10-file-connector.md
@@ -8,7 +8,7 @@
4. Using the Topic specified in `pubSubConfig.subject`, send a message to EventMesh, which will then be persisted to the file.

```yaml
# Common configuration
pubSubConfig:
meshAddress: 127.0.0.1:10000
subject: TopicTest
@@ -29,10 +29,10 @@ connectorConfig:
1. Start your EventMesh Runtime.
2. Enable sourceConnector and check `source-config.yml`.
3. Start your FileConnectServer, which sends the data read from `connectorConfig.filePath` to `pubSubConfig.subject` in the EventMesh Runtime.
4. Appends to the file content are recognized, and you will receive the message in EventMesh.

```yaml
# Common configuration
pubSubConfig:
meshAddress: 127.0.0.1:10000
subject: TopicTest
@@ -48,4 +48,4 @@ connectorConfig:
filePath: userFilePath
```
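The append detection in steps 3 and 4 can be sketched as a simple offset-based tail loop. This is illustrative only, not the actual FileConnectServer implementation, and the file name `userFilePath.txt` is made up:

```python
import os
import tempfile

def read_appends(path, offset):
    """Return the lines appended since `offset`, plus the new offset.

    Sketch of steps 3 and 4: the connector notices content appended to
    `connectorConfig.filePath` and forwards each new line as a message.
    """
    with open(path, "r", encoding="utf-8") as f:
        f.seek(offset)
        new_lines = f.read().splitlines()
        return new_lines, f.tell()

# Demo: pre-existing content is skipped; only the append is picked up.
path = os.path.join(tempfile.mkdtemp(), "userFilePath.txt")
with open(path, "w", encoding="utf-8") as f:
    f.write("old line\n")
offset = os.path.getsize(path)

with open(path, "a", encoding="utf-8") as f:
    f.write("hello EventMesh\n")

new_lines, offset = read_appends(path, offset)
print(new_lines)  # ['hello EventMesh']
```

A real connector would run such a check in a loop (or use filesystem notifications) and publish each new line to `pubSubConfig.subject`.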

> Special note: System.in and System.out are used if the source file or sink file cannot be opened.
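The fallback described in the note above can be expressed as follows. This is a sketch of the stated behavior only, not the connector's actual code; `sys.stdin`/`sys.stdout` stand in for Java's `System.in`/`System.out`:

```python
import os
import sys

def open_source(path):
    # Per the note: if the source file cannot be opened,
    # fall back to standard input.
    if path and os.path.isfile(path):
        return open(path, "r", encoding="utf-8")
    return sys.stdin

def open_sink(path):
    # Likewise, fall back to standard output when the sink
    # file cannot be opened.
    try:
        return open(path, "a", encoding="utf-8")
    except (OSError, TypeError):
        return sys.stdout

print(open_source("/no/such/file") is sys.stdin)  # True
```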
56 changes: 56 additions & 0 deletions docs/design-document/03-connect/11-kafka-connector.md
@@ -0,0 +1,56 @@
# Kafka

## KafkaSinkConnector: From EventMesh to Kafka

1. Start your EventMesh Runtime.
2. Enable sinkConnector and check `sink-config.yml`.
3. Start your KafkaConnectServer, which will subscribe to the topic defined in `pubSubConfig.subject` in the EventMesh Runtime and publish the received data to `connectorConfig.topic` in Kafka.
4. Using the Topic specified in `pubSubConfig.subject`, send a message to EventMesh, which you will then see in Kafka.

```yaml
# Common configuration
pubSubConfig:
meshAddress: 127.0.0.1:10000
subject: TopicTest
idc: FT
env: PRD
group: kafkaSink
appId: 5031
userName: kafkaSinkUser
passWord: kafkaPassWord
connectorConfig:
connectorName: kafkaSink
  # Kafka connection parameters
bootstrapServers: 127.0.0.1:9092
topic: TopicTest
keyConverter: org.apache.kafka.common.serialization.StringSerializer
valueConverter: org.apache.kafka.common.serialization.StringSerializer
```
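The `connectorConfig` fields above correspond closely to Kafka's standard producer properties. A hypothetical translation helper is shown below; the field-to-property mapping is an assumption about the connector's internals, but `bootstrap.servers`, `key.serializer`, and `value.serializer` are Kafka's own ProducerConfig keys:

```python
def to_producer_props(cfg):
    """Map the sink connectorConfig fields to standard Kafka producer
    property names (illustrative sketch, not the connector's code)."""
    return {
        "bootstrap.servers": cfg["bootstrapServers"],
        "key.serializer": cfg["keyConverter"],
        "value.serializer": cfg["valueConverter"],
    }

# The connectorConfig section of sink-config.yml, as a dict.
sink_cfg = {
    "connectorName": "kafkaSink",
    "bootstrapServers": "127.0.0.1:9092",
    "topic": "TopicTest",
    "keyConverter": "org.apache.kafka.common.serialization.StringSerializer",
    "valueConverter": "org.apache.kafka.common.serialization.StringSerializer",
}
print(to_producer_props(sink_cfg)["bootstrap.servers"])  # 127.0.0.1:9092
```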

## KafkaSourceConnector: From Kafka to EventMesh

1. Start your EventMesh Runtime.
2. Enable sourceConnector and check `source-config.yml`.
3. Start your KafkaConnectServer, which will subscribe to Kafka's `connectorConfig.topic` and send the read data to `pubSubConfig.subject` in the EventMesh Runtime.
4. Send a message to Kafka, and you'll receive it in EventMesh.

```yaml
# Common configuration
pubSubConfig:
meshAddress: 127.0.0.1:10000
subject: TopicTest
idc: FT
env: PRD
group: kafkaSource
appId: 5032
userName: kafkaSourceUser
passWord: kafkaPassWord
connectorConfig:
connectorName: kafkaSource
  # Kafka connection parameters
bootstrapServers: 127.0.0.1:9092
topic: TopicTest
groupID: kafkaSource
sessionTimeoutMS: 10000
maxPollRecords: 1000
```
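Similarly, the source `connectorConfig` fields line up with Kafka's standard consumer properties. Again, the mapping itself is a hypothetical sketch, while `group.id`, `session.timeout.ms`, and `max.poll.records` are Kafka's own ConsumerConfig keys:

```python
def to_consumer_props(cfg):
    """Map the source connectorConfig fields to standard Kafka consumer
    property names (illustrative sketch, not the connector's code)."""
    return {
        "bootstrap.servers": cfg["bootstrapServers"],
        "group.id": cfg["groupID"],
        "session.timeout.ms": cfg["sessionTimeoutMS"],
        "max.poll.records": cfg["maxPollRecords"],
    }

# The connectorConfig section of source-config.yml, as a dict.
source_cfg = {
    "bootstrapServers": "127.0.0.1:9092",
    "topic": "TopicTest",
    "groupID": "kafkaSource",
    "sessionTimeoutMS": 10000,
    "maxPollRecords": 1000,
}
print(to_consumer_props(source_cfg)["group.id"])  # kafkaSource
```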
@@ -4,7 +4,7 @@

1. Start your EventMesh Runtime.
2. Enable sinkConnector and check `sink-config.yml`.
3. Start your FileConnectServer, which will subscribe to the topic defined in `pubSubConfig.subject` in the EventMesh Runtime and write the data to a file under the path `connectorConfig.topic`/year/month/day, named [`connectorConfig.topic` + current hour (24-hour clock) + timestamp].
4. Using the Topic specified in `pubSubConfig.subject`, send a message to EventMesh, which will then be persisted to the file.
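The directory and file naming scheme from step 3 can be sketched as below. Only the year/month/day layout and the topic + hour + timestamp name come from the text; the exact separators and timestamp format are assumptions:

```python
from datetime import datetime

def sink_file_path(topic, now=None):
    # Step 3: files go under <topic>/year/month/day, named
    # <topic> + current hour (24-hour clock) + timestamp.
    now = now or datetime.now()
    directory = f"{topic}/{now.year}/{now.month}/{now.day}"
    filename = f"{topic}{now.hour:02d}{int(now.timestamp())}"
    return f"{directory}/{filename}"

print(sink_file_path("TopicTest", datetime(2024, 3, 12, 15, 30)))
```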

```yaml
@@ -29,7 +29,7 @@ connectorConfig:
1. Start your EventMesh Runtime.
2. Enable sourceConnector and check `source-config.yml`.
3. Start your FileConnectServer, which sends the data read from `connectorConfig.filePath` to `pubSubConfig.subject` in the EventMesh Runtime.
4. Appends to the file content are recognized, and you will receive the message in EventMesh.

```yaml
# Common configuration
@@ -48,4 +48,4 @@ connectorConfig:
filePath: userFilePath
```

> Special note: System.in and System.out are used if the source file or sink file cannot be opened.
@@ -0,0 +1,56 @@
# Kafka

## KafkaSinkConnector: From EventMesh to Kafka

1. Start your EventMesh Runtime.
2. Enable sinkConnector and check `sink-config.yml`.
3. Start your KafkaConnectServer, which will subscribe to the topic defined in `pubSubConfig.subject` in the EventMesh Runtime and publish the received data to `connectorConfig.topic` in Kafka.
4. Using the Topic specified in `pubSubConfig.subject`, send a message to EventMesh, which you will then see in Kafka.

```yaml
# Common configuration
pubSubConfig:
meshAddress: 127.0.0.1:10000
subject: TopicTest
idc: FT
env: PRD
group: kafkaSink
appId: 5031
userName: kafkaSinkUser
passWord: kafkaPassWord
connectorConfig:
connectorName: kafkaSink
  # Kafka connection parameters
bootstrapServers: 127.0.0.1:9092
topic: TopicTest
keyConverter: org.apache.kafka.common.serialization.StringSerializer
valueConverter: org.apache.kafka.common.serialization.StringSerializer
```

## KafkaSourceConnector: From Kafka to EventMesh

1. Start your EventMesh Runtime.
2. Enable sourceConnector and check `source-config.yml`.
3. Start your KafkaConnectServer, which will subscribe to `connectorConfig.topic` in Kafka and send the read data to `pubSubConfig.subject` in the EventMesh Runtime.
4. Send a message to Kafka, and you'll receive it in EventMesh.

```yaml
# Common configuration
pubSubConfig:
meshAddress: 127.0.0.1:10000
subject: TopicTest
idc: FT
env: PRD
group: kafkaSource
appId: 5032
userName: kafkaSourceUser
passWord: kafkaPassWord
connectorConfig:
connectorName: kafkaSource
  # Kafka connection parameters
bootstrapServers: 127.0.0.1:9092
topic: TopicTest
groupID: kafkaSource
sessionTimeoutMS: 10000
maxPollRecords: 1000
```
