
Commit e2c233f

Author: vikasrohit
Parents: 3530f46 + 0c94e07

Merge pull request #7 from maxceem/develop

Kafka events with "action" type instead of "notification"

File tree: 115 files changed, +149 -149 lines changed


README.md

Lines changed: 34 additions & 34 deletions
@@ -21,9 +21,9 @@ The following parameters can be set in config files or in env variables:
 - KAFKA_CLIENT_CERT_KEY: Kafka connection private key, optional; default value is undefined;
 if not provided, then SSL connection is not used, direct insecure connection is used;
 if provided, it can be either path to private key file or private key content
-- CREATE_DATA_TOPIC: create data Kafka topic, default value is 'project.notification.create'
-- UPDATE_DATA_TOPIC: update data Kafka topic, default value is 'project.notification.update'
-- DELETE_DATA_TOPIC: delete data Kafka topic, default value is 'project.notification.delete'
+- CREATE_DATA_TOPIC: create data Kafka topic, default value is 'project.action.create'
+- UPDATE_DATA_TOPIC: update data Kafka topic, default value is 'project.action.update'
+- DELETE_DATA_TOPIC: delete data Kafka topic, default value is 'project.action.delete'
 - KAFKA_MESSAGE_ORIGINATOR: Kafka topic originator, default value is 'project-api'
 - esConfig: config object for Elasticsearch

@@ -41,27 +41,27 @@ Config for tests are at `config/test.js`, it overrides some default config.
 - Download kafka at `https://www.apache.org/dyn/closer.cgi?path=/kafka/1.1.0/kafka_2.11-1.1.0.tgz`
 - Extract out the doanlowded tgz file
 - Go to extracted directory kafka_2.11-0.11.0.1
-- Start ZooKeeper server:
+- Start ZooKeeper server:
 `bin/zookeeper-server-start.sh config/zookeeper.properties`
-- Use another terminal, go to same directory, start the Kafka server:
+- Use another terminal, go to same directory, start the Kafka server:
 `bin/kafka-server-start.sh config/server.properties`
 - Note that the zookeeper server is at localhost:2181, and Kafka server is at localhost:9092
-- Use another terminal, go to same directory, create some topics:
-`bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic project.notification.create`
-`bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic project.notification.update`
-`bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic project.notification.delete`
-- Verify that the topics are created:
+- Use another terminal, go to same directory, create some topics:
+`bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic project.action.create`
+`bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic project.action.update`
+`bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic project.action.delete`
+- Verify that the topics are created:
 `bin/kafka-topics.sh --list --zookeeper localhost:2181`,
 it should list out the created topics
-- run the producer and then write some message into the console to send to the `project.notification.create` topic:
-`bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.notification.create`
-in the console, write message, one message per line:
-`{"topic":"project.notification.create","originator":"project-api","timestamp":"2019-06-20T13:43:25.817Z","mime-type":"application/json","payload":{"resource":"project","createdAt":"2019-06-20T13:43:23.554Z","updatedAt":"2019-06-20T13:43:23.555Z","terms":[],"id":1,"name":"test project","description":"Hello I am a test project","type":"app","createdBy":40051333,"updatedBy":40051333,"projectEligibility":[],"bookmarks":[],"external":null,"status":"draft","lastActivityAt":"2019-06-20T13:43:23.514Z","lastActivityUserId":"40051333","members":[{"createdAt":"2019-06-20T13:43:23.555Z","updatedAt":"2019-06-20T13:43:23.625Z","id":2,"isPrimary":true,"role":"manager","userId":40051333,"updatedBy":40051333,"createdBy":40051333,"projectId":2,"deletedAt":null,"deletedBy":null}],"version":"v2","directProjectId":null,"billingAccountId":null,"estimatedPrice":null,"actualPrice":null,"details":null,"cancelReason":null,"templateId":null,"deletedBy":null,"attachments":null,"phases":null,"projectUrl":"https://connect.topcoder-dev.com/projects/2"}}`
-- Optionally, use another terminal, go to same directory, start a consumer to view the messages:
-`bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic project.notification.create --from-beginning`
-- If the kafka don't allow to input long message you can use this script to write message from file:
-`path_to_kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.notification.create < our_project_root_directory/test/data/project/project.notification.create.json`
-- Writing/reading messages to/from other topics are similar. All example for messages are in:
+- run the producer and then write some message into the console to send to the `project.action.create` topic:
+`bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.create`
+in the console, write message, one message per line:
+`{"topic":"project.action.create","originator":"project-api","timestamp":"2019-06-20T13:43:25.817Z","mime-type":"application/json","payload":{"resource":"project","createdAt":"2019-06-20T13:43:23.554Z","updatedAt":"2019-06-20T13:43:23.555Z","terms":[],"id":1,"name":"test project","description":"Hello I am a test project","type":"app","createdBy":40051333,"updatedBy":40051333,"projectEligibility":[],"bookmarks":[],"external":null,"status":"draft","lastActivityAt":"2019-06-20T13:43:23.514Z","lastActivityUserId":"40051333","members":[{"createdAt":"2019-06-20T13:43:23.555Z","updatedAt":"2019-06-20T13:43:23.625Z","id":2,"isPrimary":true,"role":"manager","userId":40051333,"updatedBy":40051333,"createdBy":40051333,"projectId":2,"deletedAt":null,"deletedBy":null}],"version":"v2","directProjectId":null,"billingAccountId":null,"estimatedPrice":null,"actualPrice":null,"details":null,"cancelReason":null,"templateId":null,"deletedBy":null,"attachments":null,"phases":null,"projectUrl":"https://connect.topcoder-dev.com/projects/2"}}`
+- Optionally, use another terminal, go to same directory, start a consumer to view the messages:
+`bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic project.action.create --from-beginning`
+- If the kafka don't allow to input long message you can use this script to write message from file:
+`path_to_kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.create < our_project_root_directory/test/data/project/project.action.create.json`
+- Writing/reading messages to/from other topics are similar. All example for messages are in:
 `our_project_root_directory/test/data`

 ## Local Elasticsearch setup
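The local setup steps in the hunk above post the sample `project.action.create` message with `kafka-console-producer.sh`. Purely as an illustration, the same envelope can be sent from Node; this sketch assumes the `kafkajs` npm package (not necessarily the client this project uses) and the local broker at localhost:9092 described above, with the sample payload trimmed down.

```js
// Hedged sketch: assumes the `kafkajs` package and the local broker from the steps above.
const { Kafka } = require('kafkajs')

// Envelope shape follows the sample message in the README; payload is trimmed here.
const message = {
  topic: 'project.action.create',
  originator: 'project-api',
  timestamp: new Date().toISOString(),
  'mime-type': 'application/json',
  payload: { resource: 'project', id: 1, name: 'test project', status: 'draft' }
}

async function send () {
  const kafka = new Kafka({ clientId: 'readme-example-producer', brokers: ['localhost:9092'] })
  const producer = kafka.producer()
  await producer.connect()
  await producer.send({
    topic: message.topic,
    messages: [{ value: JSON.stringify(message) }]
  })
  await producer.disconnect()
}

send().catch(console.error)
```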
@@ -124,8 +124,8 @@ npm run test:cov
 - Call extracted directory kafka_2.11-0.11.0.1 : `path_to_kafka`
 - Call our project root directory : `our_project_root_directory`
 - Start kafka server, start elasticsearch, initialize Elasticsearch, start processor app
-- Send message:
-`path_to_kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.notification.create < our_project_root_directory/test/data/project/project.notification.create.json`
+- Send message:
+`path_to_kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.create < our_project_root_directory/test/data/project/project.action.create.json`
 - run command `npm run view-data projects 1` to view the created data, you will see the data are properly created:

 ```bash
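For the verification step in the hunk above, `npm run view-data projects 1` is the documented way to inspect the indexed document. As a sketch of what that lookup amounts to, the following assumes the `@elastic/elasticsearch` v7 client, a local Elasticsearch node at http://localhost:9200, and that `projects` is the index name; the project's actual view-data script and esConfig may differ.

```js
// Hedged sketch only: the documented command is `npm run view-data projects 1`.
// Assumes the @elastic/elasticsearch v7 client and a local node; index name and id
// are taken from the command's arguments, not from this commit.
const { Client } = require('@elastic/elasticsearch')

async function viewData (index, id) {
  const client = new Client({ node: 'http://localhost:9200' })
  const { body } = await client.get({ index, id })
  console.log(JSON.stringify(body._source, null, 2))
}

viewData('projects', '1').catch(console.error)
```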
@@ -177,18 +177,18 @@ info: {
 ```


-- Run the producer and then write some invalid message into the console to send to the `project.notification.create` topic:
-`bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.notification.create`
-in the console, write message, one message per line:
-`{ "topic": "project.notification.create", "originator": "project-api", "timestamp": "2019-02-16T00:00:00", "mime-type": "application/json", "payload": { "id": "invalid", "typeId": "8e17090c-465b-4c17-b6d9-dfa16300b0ff", "track": "Code", "name": "test", "description": "desc", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0aa", "phases": [{ "id": "8e17090c-465b-4c17-b6d9-dfa16300b012", "name": "review", "isActive": true, "duration": 10000 }], "prizeSets": [{ "type": "prize", "prizes": [{ "type": "winning prize", "value": 500 }] }], "reviewType": "code review", "tags": ["code"], "projectId": 123, "forumId": 456, "status": "Active", "created": "2019-02-16T00:00:00", "createdBy": "admin" } }`
+- Run the producer and then write some invalid message into the console to send to the `project.action.create` topic:
+`bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.create`
+in the console, write message, one message per line:
+`{ "topic": "project.action.create", "originator": "project-api", "timestamp": "2019-02-16T00:00:00", "mime-type": "application/json", "payload": { "id": "invalid", "typeId": "8e17090c-465b-4c17-b6d9-dfa16300b0ff", "track": "Code", "name": "test", "description": "desc", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0aa", "phases": [{ "id": "8e17090c-465b-4c17-b6d9-dfa16300b012", "name": "review", "isActive": true, "duration": 10000 }], "prizeSets": [{ "type": "prize", "prizes": [{ "type": "winning prize", "value": 500 }] }], "reviewType": "code review", "tags": ["code"], "projectId": 123, "forumId": 456, "status": "Active", "created": "2019-02-16T00:00:00", "createdBy": "admin" } }`

-`{ "topic": "project.notification.create", "originator": "project-api", "timestamp": "2019-02-16T00:00:00", "mime-type": "application/json", "payload": { "id": "173803d3-019e-4033-b1cf-d7205c7f774c", "typeId": "8e17090c-465b-4c17-b6d9-dfa16300b0ff", "track": "Code", "name": "test", "description": "desc", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0aa", "phases": [{ "id": "8e17090c-465b-4c17-b6d9-dfa16300b012", "name": "review", "isActive": true, "duration": 10000 }], "prizeSets": [{ "type": "prize", "prizes": [{ "type": "winning prize", "value": 500 }] }], "reviewType": "code review", "tags": ["code"], "projectId": 123, "forumId": -456, "status": "Active", "created": "2018-01-02T00:00:00", "createdBy": "admin" } }`
+`{ "topic": "project.action.create", "originator": "project-api", "timestamp": "2019-02-16T00:00:00", "mime-type": "application/json", "payload": { "id": "173803d3-019e-4033-b1cf-d7205c7f774c", "typeId": "8e17090c-465b-4c17-b6d9-dfa16300b0ff", "track": "Code", "name": "test", "description": "desc", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0aa", "phases": [{ "id": "8e17090c-465b-4c17-b6d9-dfa16300b012", "name": "review", "isActive": true, "duration": 10000 }], "prizeSets": [{ "type": "prize", "prizes": [{ "type": "winning prize", "value": 500 }] }], "reviewType": "code review", "tags": ["code"], "projectId": 123, "forumId": -456, "status": "Active", "created": "2018-01-02T00:00:00", "createdBy": "admin" } }`

 `{ [ { abc`
 - Then in the app console, you will see error messages

-- Sent message to update data:
-`path_to_kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.notification.update < our_project_root_directory/test/data/project/project.notification.update.json`
+- Sent message to update data:
+`path_to_kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.update < our_project_root_directory/test/data/project/project.action.update.json`
 - Run command `npm run view-data projects 1` to view the updated data, you will see the data are properly updated:

 ```bash
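The invalid examples in the hunk above (a payload with a bad id, and the malformed `{ [ { abc` line) are reported as error messages in the app console. As an illustration of why, and not the processor's actual validation code (which is not part of this commit), a message must at least parse as JSON and carry the envelope fields used throughout this README before its payload can be handled:

```js
// Illustrative envelope check; the processor's real validation is not shown in this diff.
function parseEnvelope (raw) {
  let msg
  try {
    msg = JSON.parse(raw) // a line like `{ [ { abc` already fails here
  } catch (err) {
    throw new Error(`Invalid message JSON: ${err.message}`)
  }
  // Fields carried by every sample message in this README.
  for (const field of ['topic', 'originator', 'timestamp', 'mime-type', 'payload']) {
    if (msg[field] === undefined) {
      throw new Error(`Missing required field: ${field}`)
    }
  }
  return msg
}

try {
  parseEnvelope('{ [ { abc')
} catch (err) {
  console.error(err.message) // logged instead of being processed
}
```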
@@ -242,15 +242,15 @@ info: {
 ```


-- Run the producer and then write some invalid message into the console to send to the `project.notification.create` topic:
-`bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.notification.create`
-in the console, write message, one message per line:
-`{ "topic": "project.notification.update", "originator": "project-api", "timestamp": "2019-02-17T01:00:00", "mime-type": "application/json", "payload": { "id": "173803d3-019e-4033-b1cf-d7205c7f774c", "typeId": "123", "track": "Code", "name": "test3", "description": "desc3", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0dd", "groups": ["group2", "group3"], "updated": "2019-02-17T01:00:00", "updatedBy": "admin" } }`
+- Run the producer and then write some invalid message into the console to send to the `project.action.create` topic:
+`bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.create`
+in the console, write message, one message per line:
+`{ "topic": "project.action.update", "originator": "project-api", "timestamp": "2019-02-17T01:00:00", "mime-type": "application/json", "payload": { "id": "173803d3-019e-4033-b1cf-d7205c7f774c", "typeId": "123", "track": "Code", "name": "test3", "description": "desc3", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0dd", "groups": ["group2", "group3"], "updated": "2019-02-17T01:00:00", "updatedBy": "admin" } }`

-`{ "topic": "project.notification.update", "originator": "project-api", "timestamp": "2019-02-17T01:00:00", "mime-type": "application/json", "payload": { "id": "173803d3-019e-4033-b1cf-d7205c7f774c", "typeId": "8e17090c-465b-4c17-b6d9-dfa16300b0ff", "track": ["Code"], "name": "test3", "description": "desc3", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0dd", "groups": ["group2", "group3"], "updated": "2019-02-17T01:00:00", "updatedBy": "admin" } }`
+`{ "topic": "project.action.update", "originator": "project-api", "timestamp": "2019-02-17T01:00:00", "mime-type": "application/json", "payload": { "id": "173803d3-019e-4033-b1cf-d7205c7f774c", "typeId": "8e17090c-465b-4c17-b6d9-dfa16300b0ff", "track": ["Code"], "name": "test3", "description": "desc3", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0dd", "groups": ["group2", "group3"], "updated": "2019-02-17T01:00:00", "updatedBy": "admin" } }`

 `[ [ [ } } }`
-
+
 - Then in the app console, you will see error messages

 - To test the health check API, run `export PORT=5000`, start the processor, then browse `http://localhost:5000/health` in a browser,
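The README checks the health endpoint above in a browser; the same check can be scripted from Node, assuming the processor is running with `export PORT=5000` as described. A minimal sketch:

```js
// Minimal sketch: GET the documented health endpoint and print the status code.
const http = require('http')

http.get('http://localhost:5000/health', (res) => {
  console.log(`health check responded with status ${res.statusCode}`)
  res.resume() // drain the response so the socket closes
}).on('error', (err) => {
  console.error(`health check failed: ${err.message}`)
})
```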

config/default.js

Lines changed: 3 additions & 3 deletions
@@ -12,9 +12,9 @@ module.exports = {
   KAFKA_CLIENT_CERT: process.env.KAFKA_CLIENT_CERT,
   KAFKA_CLIENT_CERT_KEY: process.env.KAFKA_CLIENT_CERT_KEY,

-  CREATE_DATA_TOPIC: process.env.CREATE_DATA_TOPIC || 'project.notification.create',
-  UPDATE_DATA_TOPIC: process.env.UPDATE_DATA_TOPIC || 'project.notification.update',
-  DELETE_DATA_TOPIC: process.env.DELETE_DATA_TOPIC || 'project.notification.delete',
+  CREATE_DATA_TOPIC: process.env.CREATE_DATA_TOPIC || 'project.action.create',
+  UPDATE_DATA_TOPIC: process.env.UPDATE_DATA_TOPIC || 'project.action.update',
+  DELETE_DATA_TOPIC: process.env.DELETE_DATA_TOPIC || 'project.action.delete',
   KAFKA_MESSAGE_ORIGINATOR: process.env.KAFKA_MESSAGE_ORIGINATOR || 'project-api',

   esConfig: {
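With the defaults in config/default.js switched to the `project.action.*` names, the rest of the processor presumably picks the topics up through this config rather than hard-coding them. A minimal sketch, assuming the standard node `config` package (the usual reading of a `config/default.js` / `config/test.js` layout) and a hypothetical `handle` helper that is not part of this repository:

```js
// Sketch of dispatching on the configured topic names; `handle` is hypothetical.
const config = require('config') // loads config/default.js shown above

function handle (message) {
  switch (message.topic) {
    case config.get('CREATE_DATA_TOPIC'): // 'project.action.create' by default
      return 'create'
    case config.get('UPDATE_DATA_TOPIC'): // 'project.action.update' by default
      return 'update'
    case config.get('DELETE_DATA_TOPIC'): // 'project.action.delete' by default
      return 'delete'
    default:
      throw new Error(`Unknown topic: ${message.topic}`)
  }
}

console.log(handle({ topic: 'project.action.create' })) // => 'create'
```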
