Commit 88fb0fe

Winning submission of kafka and ES integration contest

File tree

14 files changed: +3525 -0 lines changed


.gitignore

Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
.idea
node_modules
*.log
.DS_Store
.env

README.md

Lines changed: 146 additions & 0 deletions
@@ -0,0 +1,146 @@
# Topcoder - Submission Processor

## Dependencies

- nodejs https://nodejs.org/en/ (v8+)
- Kafka
- ElasticSearch
- Docker, Docker Compose

## Notes

Elasticsearch v6+ supports only one type per index, see
https://www.elastic.co/guide/en/elasticsearch/reference/current/removal-of-types.html
so the same configured index type is used for all 4 resource types; they are distinguished by the 'resource' attribute of the indexed data.
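Because all resources share one index type, any lookup has to filter on that 'resource' attribute. The sketch below shows one way such a query body could be built; the index and type names mirror the defaults in `config/default.js`, but the query shape is an assumption for illustration, not the app's actual code.

```javascript
// Sketch: build an Elasticsearch search request that distinguishes
// resources stored in the single shared index type by the 'resource'
// attribute. Illustrative only; not the processor's real query code.
function buildResourceQuery (resource, id) {
  return {
    index: 'submission-index',   // default ELASTICSEARCH_INDEX
    type: 'submission',          // default ELASTICSEARCH_INDEX_TYPE
    body: {
      query: {
        bool: {
          filter: [
            { term: { resource } }, // e.g. 'submission', 'review', ...
            { term: { id } }
          ]
        }
      }
    }
  }
}

const q = buildResourceQuery('review', '2222')
console.log(JSON.stringify(q.body.query.bool.filter))
```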

## Configuration

Configuration for the processor is at `config/default.js`.
The following parameters can be set in config files or in env variables:

- LOG_LEVEL: the log level; default value: 'debug'
- KAFKA_URL: comma separated Kafka hosts; default value: 'localhost:9092'
- KAFKA_CLIENT_CERT: Kafka connection certificate, optional; default value is undefined;
  if not provided, SSL is not used and a direct insecure connection is made;
  if provided, it can be either a path to the certificate file or the certificate content
- KAFKA_CLIENT_CERT_KEY: Kafka connection private key, optional; default value is undefined;
  if not provided, SSL is not used and a direct insecure connection is made;
  if provided, it can be either a path to the private key file or the private key content
- CREATE_DATA_TOPIC: create data Kafka topic; default value: 'submission.notification.create'
- UPDATE_DATA_TOPIC: update data Kafka topic; default value: 'submission.notification.update'
- DELETE_DATA_TOPIC: delete data Kafka topic; default value: 'submission.notification.delete'
- ELASTICSEARCH_CONFIG: the config used to create the Elasticsearch client; see
  https://www.elastic.co/guide/en/elasticsearch/client/javascript-api/current/configuration.html
  for full details; an SSL connection can be configured via the ssl config field
- ELASTICSEARCH_INDEX: the Elasticsearch index to store Kafka message data; default value: 'submission-index'
- ELASTICSEARCH_INDEX_TYPE: the Elasticsearch index type name; default value: 'submission'
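Each of these parameters follows the same env-or-default pattern: the environment variable wins when set, otherwise the documented default applies. A minimal sketch of that pattern (showing a subset of the keys above):

```javascript
// Sketch of the env-or-default pattern used by the configuration:
// each parameter falls back to its documented default when the
// corresponding environment variable is unset.
function readConfig (env) {
  return {
    LOG_LEVEL: env.LOG_LEVEL || 'debug',
    KAFKA_URL: env.KAFKA_URL || 'localhost:9092',
    CREATE_DATA_TOPIC: env.CREATE_DATA_TOPIC || 'submission.notification.create',
    ELASTICSEARCH_INDEX: env.ELASTICSEARCH_INDEX || 'submission-index'
  }
}

console.log(readConfig({}).LOG_LEVEL) // 'debug' when nothing is set
console.log(readConfig({ LOG_LEVEL: 'info' }).LOG_LEVEL) // 'info'
```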

## Local Kafka setup

- `http://kafka.apache.org/quickstart` contains details to set up and manage a Kafka server;
  the steps below set up a Kafka server on Mac; on Windows use the bat commands in bin/windows instead
- download Kafka from `https://www.apache.org/dyn/closer.cgi?path=/kafka/1.1.0/kafka_2.11-1.1.0.tgz`
- extract the downloaded tgz file
- go to the extracted directory kafka_2.11-1.1.0
- start the ZooKeeper server:
  `bin/zookeeper-server-start.sh config/zookeeper.properties`
- in another terminal, go to the same directory and start the Kafka server:
  `bin/kafka-server-start.sh config/server.properties`
- note that the ZooKeeper server is at localhost:2181 and the Kafka server is at localhost:9092
- in another terminal, go to the same directory and create the topics:
  `bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic submission.notification.create`
  `bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic submission.notification.update`
  `bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic submission.notification.delete`
- verify that the topics were created:
  `bin/kafka-topics.sh --list --zookeeper localhost:2181`
  should list the created topics
- run the producer and write some messages into the console to send to the `submission.notification.create` topic:
  `bin/kafka-console-producer.sh --broker-list localhost:9092 --topic submission.notification.create`
  in the console, write messages, one message per line:
  `{ "topic": "submission.notification.create", "originator": "submission-api", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json", "payload": { "resource": "submission", "id": "1111", "type": "ContestSubmission", "url": "http://test.com/path", "memberId": "aaaa", "challengeId": "bbbb", "created": "2018-01-02T00:00:00", "updated": "2018-02-03T00:00:00", "createdBy": "admin", "updatedBy": "user" } }`
- optionally, in another terminal, go to the same directory and start a consumer to view the messages:
  `bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic submission.notification.create --from-beginning`
- writing/reading messages to/from the other topics is similar
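Messages fed to kafka-console-producer must be single-line JSON in the envelope shown above. A small sketch of building such a message programmatically; the helper name is hypothetical, only the envelope fields come from the samples above:

```javascript
// Hypothetical helper: build the one-line JSON envelope expected on the
// submission topics. Field names mirror the sample messages in this README.
function buildMessage (topic, payload) {
  return JSON.stringify({
    topic,
    originator: 'submission-api',
    timestamp: new Date().toISOString(),
    'mime-type': 'application/json',
    payload
  })
}

const msg = buildMessage('submission.notification.create', {
  resource: 'submission',
  id: '1111',
  type: 'ContestSubmission',
  url: 'http://test.com/path'
})
console.log(msg) // a single line, safe to paste into kafka-console-producer
```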

## ElasticSearch setup

- go to the docker folder and run `docker-compose up`

## Local deployment

- install dependencies: `npm i`
- run the code lint check: `npm run lint`; running `npm run lint-fix` can fix some lint errors, if any
- initialize Elasticsearch, creating the configured Elasticsearch index if not present: `npm run init-es`
- or re-create the index: `npm run init-es force`
- run tests: `npm run test`
- start the processor app: `npm start`

## Verification

- start the Kafka server, start Elasticsearch, initialize Elasticsearch, start the processor app
- start kafka-console-producer to write messages to the `submission.notification.create` topic:
  `bin/kafka-console-producer.sh --broker-list localhost:9092 --topic submission.notification.create`
- write message:
  `{ "topic": "submission.notification.create", "originator": "submission-api", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json", "payload": { "resource": "submission", "id": "1111", "type": "ContestSubmission", "url": "http://test.com/path", "memberId": "aaaa", "challengeId": "bbbb", "created": "2018-01-02T00:00:00", "updated": "2018-02-03T00:00:00", "createdBy": "admin", "updatedBy": "user" } }`
- run `npm run view-data 1111` to view the created data; you will see that the data was properly created:

```bash
info: Elasticsearch data:
info: {
    "resource": "submission",
    "id": "1111",
    "type": "ContestSubmission",
    "url": "http://test.com/path",
    "memberId": "aaaa",
    "challengeId": "bbbb",
    "created": "2018-01-02T00:00:00",
    "updated": "2018-02-03T00:00:00",
    "createdBy": "admin",
    "updatedBy": "user"
}
```
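Internally, the processor's job at this point reduces to mapping the incoming topic to an Elasticsearch operation. A hedged sketch of that routing; the handler names and the in-memory stand-in client are illustrative assumptions, not the app's real API:

```javascript
// Sketch: route a Kafka message to a create/update/delete operation by
// its topic. Handler names are illustrative, not the processor's code.
const handlers = {
  'submission.notification.create': (es, msg) => es.create(msg.payload),
  'submission.notification.update': (es, msg) => es.update(msg.payload),
  'submission.notification.delete': (es, msg) => es.remove(msg.payload)
}

function dispatch (es, msg) {
  const handler = handlers[msg.topic]
  if (!handler) throw new Error(`No handler for topic ${msg.topic}`)
  return handler(es, msg)
}

// Tiny in-memory stand-in for the Elasticsearch client, for demonstration:
const calls = []
const fakeEs = {
  create: p => calls.push(['create', p.id]),
  update: p => calls.push(['update', p.id]),
  remove: p => calls.push(['remove', p.id])
}
dispatch(fakeEs, { topic: 'submission.notification.delete', payload: { id: '1111' } })
console.log(calls) // calls now records one 'remove' for id '1111'
```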

- you may write an invalid message, e.g. one missing the required `id` field:
  `{ "topic": "submission.notification.create", "originator": "submission-api", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json", "payload": { "resource": "submission", "type": "ContestSubmission", "url": "http://test.com/path", "memberId": "aaaa", "challengeId": "bbbb", "created": "2018-01-02T00:00:00", "updated": "2018-02-03T00:00:00", "createdBy": "admin", "updatedBy": "user" } }`
- then in the app console you will see an error message
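That error comes from message validation. A minimal sketch of the kind of check involved; the rules shown are assumptions inferred from the sample messages, not the app's actual validation schema:

```javascript
// Hypothetical validation: a payload must at least carry 'resource' and
// 'id'. The invalid sample message above is missing 'id'.
function validatePayload (payload) {
  const errors = []
  if (!payload || typeof payload !== 'object') {
    errors.push('payload must be an object')
  } else {
    if (!payload.resource) errors.push('"resource" is required')
    if (!payload.id) errors.push('"id" is required')
  }
  return errors
}

console.log(validatePayload({ resource: 'submission', id: '1111' })) // no errors
console.log(validatePayload({ resource: 'submission' })) // the missing id is reported
```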

- start kafka-console-producer to write messages to the `submission.notification.update` topic:
  `bin/kafka-console-producer.sh --broker-list localhost:9092 --topic submission.notification.update`
- write message:
  `{ "topic": "submission.notification.update", "originator": "submission-api", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json", "payload": { "resource": "submission", "id": "1111", "type": "ContestSubmission2", "url": "http://test.com/path2", "memberId": "aaaa2", "challengeId": "bbbb2", "created": "2018-03-03T00:00:00", "updated": "2018-04-04T00:00:00", "createdBy": "admin2", "updatedBy": "user2" } }`
- run `npm run view-data 1111` to view the data; you will see that the data was properly updated:

```bash
info: Elasticsearch data:
info: {
    "resource": "submission",
    "id": "1111",
    "type": "ContestSubmission2",
    "url": "http://test.com/path2",
    "memberId": "aaaa2",
    "challengeId": "bbbb2",
    "created": "2018-03-03T00:00:00",
    "updated": "2018-04-04T00:00:00",
    "createdBy": "admin2",
    "updatedBy": "user2"
}
```

- start kafka-console-producer to write messages to the `submission.notification.delete` topic:
  `bin/kafka-console-producer.sh --broker-list localhost:9092 --topic submission.notification.delete`
- write message:
  `{ "topic": "submission.notification.delete", "originator": "submission-api", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json", "payload": { "resource": "submission", "id": "1111", "type": "ContestSubmission2", "url": "http://test.com/path2", "memberId": "aaaa2", "challengeId": "bbbb2", "created": "2018-03-03T00:00:00", "updated": "2018-04-04T00:00:00", "createdBy": "admin2", "updatedBy": "user2" } }`
- run `npm run view-data 1111`; you will see that the data was properly deleted:

```bash
info: The data is not found.
```

- management of the other resource types is similar; below are valid Kafka messages for the other resource types, so that you may test them easily;
  you may need to change their topic to `submission.notification.create`, `submission.notification.update` or `submission.notification.delete`
- review:
  `{ "topic": "submission.notification.create", "originator": "submission-api", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json", "payload": { "resource": "review", "id": "2222", "score": 88, "typeId": "ababab", "reviewerId": "aaaa", "scoreCardId": "bbbbxxx", "submissionId": "fjfjfj", "created": "2018-01-02T00:00:00", "updated": "2018-02-03T00:00:00", "createdBy": "admin", "updatedBy": "user" } }`
- reviewType:
  `{ "topic": "submission.notification.create", "originator": "submission-api", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json", "payload": { "resource": "reviewType", "id": "3333", "name": "some review type", "isActive": true } }`
- reviewSummation:
  `{ "topic": "submission.notification.create", "originator": "submission-api", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json", "payload": { "resource": "reviewSummation", "id": "4444", "submissionId": "asdfasdf", "aggregateScore": 98, "scoreCardId": "abbccaaa", "isPassing": true, "created": "2018-01-02T00:00:00", "updated": "2018-02-03T00:00:00", "createdBy": "admin", "updatedBy": "user" } }`
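To reuse one of these sample messages against the update or delete topics, only the `topic` field needs to change. A small sketch of re-targeting a message (the helper name is hypothetical):

```javascript
// Hypothetical helper: re-target a sample message JSON string to another
// topic, so the same payload can be used to test update/delete flows.
function retarget (messageJson, newTopic) {
  const msg = JSON.parse(messageJson)
  msg.topic = newTopic
  return JSON.stringify(msg)
}

// Abbreviated version of the review sample message above:
const review = '{ "topic": "submission.notification.create", "originator": "submission-api", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json", "payload": { "resource": "review", "id": "2222" } }'
console.log(JSON.parse(retarget(review, 'submission.notification.update')).topic)
// 'submission.notification.update'
```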

config/default.js

Lines changed: 25 additions & 0 deletions
@@ -0,0 +1,25 @@
/**
 * The configuration file.
 */
module.exports = {
  LOG_LEVEL: process.env.LOG_LEVEL || 'debug',

  KAFKA_URL: process.env.KAFKA_URL || 'localhost:9092',
  // below are used for secure Kafka connection, they are optional
  // for the local Kafka, they are not needed
  KAFKA_CLIENT_CERT: process.env.KAFKA_CLIENT_CERT,
  KAFKA_CLIENT_CERT_KEY: process.env.KAFKA_CLIENT_CERT_KEY,

  CREATE_DATA_TOPIC: process.env.CREATE_DATA_TOPIC || 'submission.notification.create',
  UPDATE_DATA_TOPIC: process.env.UPDATE_DATA_TOPIC || 'submission.notification.update',
  DELETE_DATA_TOPIC: process.env.DELETE_DATA_TOPIC || 'submission.notification.delete',

  // see https://www.elastic.co/guide/en/elasticsearch/client/javascript-api/current/configuration.html
  // for full config details; for SSL connection, see the ssl field details in the above page
  ELASTICSEARCH_CONFIG: {
    host: process.env.ELASTICSEARCH_HOST || 'localhost:9200'
  },

  ELASTICSEARCH_INDEX: process.env.ELASTICSEARCH_INDEX || 'submission-index',
  ELASTICSEARCH_INDEX_TYPE: process.env.ELASTICSEARCH_INDEX_TYPE || 'submission'
}

docker/docker-compose.yml

Lines changed: 6 additions & 0 deletions
@@ -0,0 +1,6 @@
version: "2"
services:
  esearch:
    image: "docker.elastic.co/elasticsearch/elasticsearch:6.3.1"
    ports:
      - "9200:9200"
