Commit fea8652

Initial commit

0 parents  commit fea8652

File tree: 98 files changed, +10583 −0 lines changed
.dockerignore

Lines changed: 4 additions & 0 deletions
@@ -0,0 +1,4 @@
node_modules/
.env
coverage/
.nyc_output/

.editorconfig

Lines changed: 24 additions & 0 deletions
@@ -0,0 +1,24 @@
# EditorConfig is awesome: http://EditorConfig.org

# top-most EditorConfig file
root = true

# Unix-style newlines with a newline ending every file
[*]
end_of_line = lf
insert_final_newline = true

# Matches multiple files with brace expansion notation
# Set default charset
[*.{js,py}]
charset = utf-8

# Indentation override for all JS under lib directory
[lib/**.js]
indent_style = space
indent_size = 2

# Matches the exact files either package.json or .travis.yml
[{package.json,.travis.yml}]
indent_style = space
indent_size = 2

.gitignore

Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
.idea
node_modules
*.log
.DS_Store
.env
.nyc_output
coverage/

README.md

Lines changed: 257 additions & 0 deletions
@@ -0,0 +1,257 @@
# Topcoder - Project Elasticsearch Processor

## Dependencies

- Node.js https://nodejs.org/en/ (v8+)
- Kafka
- Elasticsearch
- Docker, Docker Compose
## Configuration

Configuration for the processor is at `config/default.js`.
The following parameters can be set in config files or in env variables:

- LOG_LEVEL: the log level; default value: 'debug'
- KAFKA_URL: comma separated Kafka hosts; default value: 'localhost:9092'
- KAFKA_GROUP_ID: the Kafka group id; default value: 'project-processor-es'
- KAFKA_CLIENT_CERT: Kafka connection certificate, optional; default value is undefined;
  if not provided, an SSL connection is not used and a direct insecure connection is used instead;
  if provided, it can be either the path to the certificate file or the certificate content
- KAFKA_CLIENT_CERT_KEY: Kafka connection private key, optional; default value is undefined;
  if not provided, an SSL connection is not used and a direct insecure connection is used instead;
  if provided, it can be either the path to the private key file or the private key content
- CREATE_DATA_TOPIC: create data Kafka topic, default value is 'project.notification.create'
- UPDATE_DATA_TOPIC: update data Kafka topic, default value is 'project.notification.update'
- DELETE_DATA_TOPIC: delete data Kafka topic, default value is 'project.notification.delete'
- KAFKA_MESSAGE_ORIGINATOR: Kafka topic originator, default value is 'project-api'
- esConfig: config object for Elasticsearch

Refer to the `esConfig` variable in `config/default.js` for ES-related configuration.

Also note that there is a `/health` endpoint that checks the health of the app. It starts an Express server that listens on the port given by the `PORT` environment variable; `PORT` is not part of the configuration file and must be passed as an environment variable.

Config for tests is at `config/test.js`; it overrides some of the default config.
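As an illustration of the `/health` endpoint described above, here is a minimal Express sketch; the wiring is an assumption, not this commit's actual implementation (the verification section below shows the endpoint returning `{"checksRun":1}`):

```js
// Sketch: a /health endpoint on an Express server bound to PORT
// (assumed wiring; not necessarily the processor's actual code).
const express = require('express')

const app = express()
let checksRun = 0

app.get('/health', (req, res) => {
  checksRun += 1
  res.json({ checksRun }) // e.g. {"checksRun":1} on the first request
})

// PORT must be supplied as an environment variable, e.g. PORT=5000
app.listen(process.env.PORT || 3000)
```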
## Local Kafka setup

- Refer to the extracted Kafka directory (e.g. `kafka_2.11-1.1.0`) as `path_to_kafka`
- Refer to our project root directory as `our_project_root_directory`
- `http://kafka.apache.org/quickstart` contains details on setting up and managing a Kafka server;
  the steps below set up a Kafka server on Mac/Linux, while Windows uses the .bat commands in `bin/windows` instead
- Download Kafka from `https://www.apache.org/dyn/closer.cgi?path=/kafka/1.1.0/kafka_2.11-1.1.0.tgz`
- Extract the downloaded tgz file
- Go to the extracted directory `kafka_2.11-1.1.0`
- Start the ZooKeeper server:
  `bin/zookeeper-server-start.sh config/zookeeper.properties`
- In another terminal, go to the same directory and start the Kafka server:
  `bin/kafka-server-start.sh config/server.properties`
- Note that the ZooKeeper server is at `localhost:2181` and the Kafka server is at `localhost:9092`
- In another terminal, go to the same directory and create the topics:
  `bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic project.notification.create`
  `bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic project.notification.update`
  `bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic project.notification.delete`
- Verify that the topics were created:
  `bin/kafka-topics.sh --list --zookeeper localhost:2181`
  should list the created topics
- Run the producer, then write messages into the console to send to the `project.notification.create` topic:
  `bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.notification.create`
  In the console, write messages, one message per line:
  `{"topic":"project.notification.create","originator":"project-api","timestamp":"2019-06-20T13:43:25.817Z","mime-type":"application/json","payload":{"resource":"project","createdAt":"2019-06-20T13:43:23.554Z","updatedAt":"2019-06-20T13:43:23.555Z","terms":[],"id":1,"name":"test project","description":"Hello I am a test project","type":"app","createdBy":40051333,"updatedBy":40051333,"projectEligibility":[],"bookmarks":[],"external":null,"status":"draft","lastActivityAt":"2019-06-20T13:43:23.514Z","lastActivityUserId":"40051333","members":[{"createdAt":"2019-06-20T13:43:23.555Z","updatedAt":"2019-06-20T13:43:23.625Z","id":2,"isPrimary":true,"role":"manager","userId":40051333,"updatedBy":40051333,"createdBy":40051333,"projectId":2,"deletedAt":null,"deletedBy":null}],"version":"v2","directProjectId":null,"billingAccountId":null,"estimatedPrice":null,"actualPrice":null,"details":null,"cancelReason":null,"templateId":null,"deletedBy":null,"attachments":null,"phases":null,"projectUrl":"https://connect.topcoder-dev.com/projects/2"}}`
- Optionally, in another terminal in the same directory, start a consumer to view the messages:
  `bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic project.notification.create --from-beginning`
- If the Kafka console does not accept such a long message, you can use this script to send a message from a file:
  `path_to_kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.notification.create < our_project_root_directory/test/test_topic/project/project.notification.create.json`
- Writing/reading messages to/from the other topics works the same way. Example messages for all topics are in:
  `our_project_root_directory/test/test_topic`
  (a programmatic alternative to the console producer is sketched below)
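As an alternative to the console producer, a message with the same envelope (`topic`, `originator`, `timestamp`, `mime-type`, `payload`) can be sent programmatically. The sketch below uses the `kafka-node` package; the library choice is an assumption, since this commit does not show which Kafka client the processor uses:

```js
// Sketch: producing a message to project.notification.create with kafka-node
// (library choice is an assumption; the envelope fields mirror the example above).
const kafka = require('kafka-node')

const client = new kafka.KafkaClient({ kafkaHost: 'localhost:9092' })
const producer = new kafka.Producer(client)

const message = {
  topic: 'project.notification.create',
  originator: 'project-api',
  timestamp: new Date().toISOString(),
  'mime-type': 'application/json',
  payload: { resource: 'project', id: 1, name: 'test project' } // abbreviated payload
}

producer.on('ready', () => {
  producer.send(
    [{ topic: 'project.notification.create', messages: JSON.stringify(message) }],
    (err, result) => {
      if (err) console.error(err)
      else console.log(result)
      client.close()
    }
  )
})
```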
## Local Elasticsearch setup

- In the `docker-es` folder, run `docker-compose up`

## Local deployment

- Install dependencies: `npm i`
- Run the code lint check with `npm run lint`; `npm run lint:fix` can fix some lint errors, if any
- Initialize Elasticsearch, creating the configured Elasticsearch indices if not present: `npm run sync:es` (see the sketch below)
- Start the processor app: `npm start`

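A conceptual sketch of what an index-initialization script like `npm run sync:es` might do, using the legacy `elasticsearch` JavaScript client; the client library and the omission of mappings are assumptions, not this commit's actual script:

```js
// Sketch: create the configured indices if they do not already exist
// (client library assumed; mappings omitted in this sketch).
const elasticsearch = require('elasticsearch')

const client = new elasticsearch.Client({ host: 'localhost:9200', apiVersion: '6.7' })

async function createIndexIfAbsent (index) {
  const exists = await client.indices.exists({ index })
  if (!exists) {
    await client.indices.create({ index })
    console.log(`Created index ${index}`)
  }
}

Promise.all(['projects', 'timelines', 'metadata'].map(createIndexIfAbsent))
  .catch((err) => console.error(err))
```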
## Local Deployment with Docker

To run the Project ES Processor using Docker, follow the steps below:

1. Navigate to the directory `docker`

2. Rename the file `sample.api.env` to `api.env`

3. Set the required AWS credentials in the file `api.env`

4. Once that is done, run the following command:

```
docker-compose up
```

5. The first time you run the application, it will take some time to download the image and install the dependencies

## Integration tests

Integration tests use different indices (`projects_test`, `timelines_test`, `metadata_test`), not the usual indices (`projects`, `timelines`, `metadata`).

While running tests, the index names can be overridden using environment variables, or left as-is to use the default test indices defined in `config/test.js`:

```
export ES_PROJECT_INDEX=projects_test
export ES_TIMELINE_INDEX=timelines_test
export ES_METADATA_INDEX=metadata_test
```

#### Running integration tests and coverage

To run the tests alone:

```
npm run test
```

To run the tests with a coverage report:

```
npm run test:cov
```

## Verification

- As above, refer to the extracted Kafka directory as `path_to_kafka` and our project root directory as `our_project_root_directory`
- Start the Kafka server, start Elasticsearch, initialize Elasticsearch, and start the processor app
- Send a message:
  `path_to_kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.notification.create < our_project_root_directory/test/test_topic/project/project.notification.create.json`
- Run command `npm run view-data projects 1` to view the created data; you will see that the data was created correctly:

```bash
info: Elasticsearch Project data:
info: {
  "createdAt": "2019-06-20T13:43:23.554Z",
  "updatedAt": "2019-06-20T13:43:23.555Z",
  "terms": [],
  "id": 1,
  "name": "test project",
  "description": "Hello I am a test project",
  "type": "app",
  "createdBy": 40051333,
  "updatedBy": 40051333,
  "projectEligibility": [],
  "bookmarks": [],
  "external": null,
  "status": "draft",
  "lastActivityAt": "2019-06-20T13:43:23.514Z",
  "lastActivityUserId": "40051333",
  "members": [
    {
      "createdAt": "2019-06-20T13:43:23.555Z",
      "updatedAt": "2019-06-20T13:43:23.625Z",
      "id": 2,
      "isPrimary": true,
      "role": "manager",
      "userId": 40051333,
      "updatedBy": 40051333,
      "createdBy": 40051333,
      "projectId": 2,
      "deletedAt": null,
      "deletedBy": null
    }
  ],
  "version": "v2",
  "directProjectId": null,
  "billingAccountId": null,
  "estimatedPrice": null,
  "actualPrice": null,
  "details": null,
  "cancelReason": null,
  "templateId": null,
  "deletedBy": null,
  "attachments": null,
  "phases": null,
  "projectUrl": "https://connect.topcoder-dev.com/projects/2"
}
```
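The `view-data` script itself is not shown in this README; conceptually it fetches a document by id from the configured index. A sketch with the legacy `elasticsearch` client (client choice and host are assumptions):

```js
// Sketch: fetching the indexed project document by id
// (compare `npm run view-data projects 1`; client library assumed).
const elasticsearch = require('elasticsearch')

const client = new elasticsearch.Client({ host: 'localhost:9200', apiVersion: '6.7' })

client.get({ index: 'projects', type: '_doc', id: 1 })
  .then((doc) => console.log(JSON.stringify(doc._source, null, 2)))
  .catch((err) => console.error(err))
```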
- Run the producer, then write some invalid messages into the console to send to the `project.notification.create` topic:
  `bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.notification.create`
  In the console, write messages, one message per line:
  `{ "topic": "project.notification.create", "originator": "project-api", "timestamp": "2019-02-16T00:00:00", "mime-type": "application/json", "payload": { "id": "invalid", "typeId": "8e17090c-465b-4c17-b6d9-dfa16300b0ff", "track": "Code", "name": "test", "description": "desc", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0aa", "phases": [{ "id": "8e17090c-465b-4c17-b6d9-dfa16300b012", "name": "review", "isActive": true, "duration": 10000 }], "prizeSets": [{ "type": "prize", "prizes": [{ "type": "winning prize", "value": 500 }] }], "reviewType": "code review", "tags": ["code"], "projectId": 123, "forumId": 456, "status": "Active", "created": "2019-02-16T00:00:00", "createdBy": "admin" } }`

  `{ "topic": "project.notification.create", "originator": "project-api", "timestamp": "2019-02-16T00:00:00", "mime-type": "application/json", "payload": { "id": "173803d3-019e-4033-b1cf-d7205c7f774c", "typeId": "8e17090c-465b-4c17-b6d9-dfa16300b0ff", "track": "Code", "name": "test", "description": "desc", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0aa", "phases": [{ "id": "8e17090c-465b-4c17-b6d9-dfa16300b012", "name": "review", "isActive": true, "duration": 10000 }], "prizeSets": [{ "type": "prize", "prizes": [{ "type": "winning prize", "value": 500 }] }], "reviewType": "code review", "tags": ["code"], "projectId": 123, "forumId": -456, "status": "Active", "created": "2018-01-02T00:00:00", "createdBy": "admin" } }`

  `{ [ { abc`
- Then in the app console, you will see error messages (a hypothetical validation sketch follows below)

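This commit's diff does not show the validation logic, but the rejection behaviour above is typical of schema validation. A hypothetical check with Joi (pre-v16 API) that would reject the first message, whose payload `id` is not a number:

```js
// Hypothetical validation sketch with Joi (NOT the processor's actual schema).
const Joi = require('joi')

const schema = Joi.object().keys({
  id: Joi.number().integer().positive().required() // "invalid" fails here
}).unknown(true)

const { error } = Joi.validate({ id: 'invalid' }, schema)
if (error) {
  console.error(error.message) // surfaced as an error message in the app console
}
```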
- Send a message to update data:
  `path_to_kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.notification.update < our_project_root_directory/test/test_topic/project/project.notification.update.json`
- Run command `npm run view-data projects 1` to view the updated data; you will see that the data was updated correctly:

```bash
info: Elasticsearch Project data:
info: {
  "createdAt": "2019-06-20T13:43:23.554Z",
  "updatedAt": "2019-06-20T13:45:20.091Z",
  "terms": [],
  "id": 1,
  "name": "project name updated",
  "description": "Hello I am a test project",
  "type": "app",
  "createdBy": 40051333,
  "updatedBy": 40051333,
  "projectEligibility": [],
  "bookmarks": [],
  "external": null,
  "status": "draft",
  "lastActivityAt": "2019-06-20T13:43:23.514Z",
  "lastActivityUserId": "40051333",
  "members": [
    {
      "createdAt": "2019-06-20T13:43:23.555Z",
      "deletedAt": null,
      "role": "manager",
      "updatedBy": 40051333,
      "createdBy": 40051333,
      "isPrimary": true,
      "id": 2,
      "userId": 40051333,
      "projectId": 2,
      "deletedBy": null,
      "updatedAt": "2019-06-20T13:43:23.625Z"
    }
  ],
  "version": "v2",
  "directProjectId": null,
  "billingAccountId": null,
  "estimatedPrice": null,
  "actualPrice": null,
  "details": null,
  "cancelReason": null,
  "templateId": null,
  "deletedBy": null,
  "attachments": [],
  "phases": [],
  "projectUrl": "https://connect.topcoder-dev.com/projects/2",
  "invites": [],
  "utm": null
}
```

- Run the producer, then write some invalid messages into the console to send to the `project.notification.update` topic:
  `bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.notification.update`
  In the console, write messages, one message per line:
  `{ "topic": "project.notification.update", "originator": "project-api", "timestamp": "2019-02-17T01:00:00", "mime-type": "application/json", "payload": { "id": "173803d3-019e-4033-b1cf-d7205c7f774c", "typeId": "123", "track": "Code", "name": "test3", "description": "desc3", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0dd", "groups": ["group2", "group3"], "updated": "2019-02-17T01:00:00", "updatedBy": "admin" } }`

  `{ "topic": "project.notification.update", "originator": "project-api", "timestamp": "2019-02-17T01:00:00", "mime-type": "application/json", "payload": { "id": "173803d3-019e-4033-b1cf-d7205c7f774c", "typeId": "8e17090c-465b-4c17-b6d9-dfa16300b0ff", "track": ["Code"], "name": "test3", "description": "desc3", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0dd", "groups": ["group2", "group3"], "updated": "2019-02-17T01:00:00", "updatedBy": "admin" } }`

  `[ [ [ } } }`
- Then in the app console, you will see error messages
- To test the health check API, run `export PORT=5000`, start the processor, then browse `http://localhost:5000/health` in a browser;
  you will see the result `{"checksRun":1}`

config/default.js

Lines changed: 30 additions & 0 deletions
@@ -0,0 +1,30 @@
/**
 * The default configuration file.
 */

module.exports = {
  LOG_LEVEL: process.env.LOG_LEVEL || 'debug',

  KAFKA_URL: process.env.KAFKA_URL || 'localhost:9092',
  KAFKA_GROUP_ID: process.env.KAFKA_GROUP_ID || 'project-processor-es',
  // below are used for secure Kafka connection, they are optional
  // for the local Kafka, they are not needed
  KAFKA_CLIENT_CERT: process.env.KAFKA_CLIENT_CERT,
  KAFKA_CLIENT_CERT_KEY: process.env.KAFKA_CLIENT_CERT_KEY,

  CREATE_DATA_TOPIC: process.env.CREATE_DATA_TOPIC || 'project.notification.create',
  UPDATE_DATA_TOPIC: process.env.UPDATE_DATA_TOPIC || 'project.notification.update',
  DELETE_DATA_TOPIC: process.env.DELETE_DATA_TOPIC || 'project.notification.delete',
  KAFKA_MESSAGE_ORIGINATOR: process.env.KAFKA_MESSAGE_ORIGINATOR || 'project-api',

  esConfig: {
    HOST: process.env.ES_HOST || 'localhost:9200',
    AWS_REGION: process.env.AWS_REGION || 'us-east-1', // AWS region to be used if we use AWS ES
    API_VERSION: process.env.ES_API_VERSION || '6.7',
    ES_PROJECT_INDEX: process.env.ES_PROJECT_INDEX || 'projects',
    ES_TIMELINE_INDEX: process.env.ES_TIMELINE_INDEX || 'timelines',
    ES_METADATA_INDEX: process.env.ES_METADATA_INDEX || 'metadata',
    ES_TYPE: process.env.ES_TYPE || '_doc', // ES 6.x accepts only one type per index, and it is mandatory to define it
    ES_METADATA_DEFAULT_ID: process.env.ES_METADATA_DEFAULT_ID || 1 // used for setting the default id of metadata
  }
}
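The `config/default.js` / `config/test.js` layout matches the conventions of the node `config` package (an assumption based on the file names): `default.js` supplies base values and the `NODE_ENV`-specific file overrides them. A usage sketch:

```js
// Sketch: reading resolved configuration via the `config` package (assumed).
const config = require('config')

console.log(config.LOG_LEVEL)                 // 'debug' unless LOG_LEVEL is set
console.log(config.esConfig.ES_PROJECT_INDEX) // 'projects'; 'projects_test' when NODE_ENV=test
```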

config/production.js

Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
/**
 * Production configuration file
 */

module.exports = {
  LOG_LEVEL: process.env.LOG_LEVEL || 'info'
}

config/test.js

Lines changed: 11 additions & 0 deletions
@@ -0,0 +1,11 @@
/**
 * Configuration file to be used while running tests
 */

module.exports = {
  esConfig: {
    ES_PROJECT_INDEX: process.env.ES_PROJECT_INDEX || 'projects_test',
    ES_TIMELINE_INDEX: process.env.ES_TIMELINE_INDEX || 'timelines_test',
    ES_METADATA_INDEX: process.env.ES_METADATA_INDEX || 'metadata_test'
  }
}

docker-es/docker-compose.yml

Lines changed: 6 additions & 0 deletions
@@ -0,0 +1,6 @@
version: "2"
services:
  esearch:
    image: "docker.elastic.co/elasticsearch/elasticsearch:6.7.2"
    ports:
      - "9200:9200"

docker/Dockerfile

Lines changed: 12 additions & 0 deletions
@@ -0,0 +1,12 @@
# Use the base image with Node.js 8.11.3
FROM node:8.11.3

# Copy the current directory into the Docker image
COPY . /project-processor-es

# Set the working directory for future use
WORKDIR /project-processor-es

# Install the dependencies from package.json
RUN npm install
CMD npm start

docker/docker-compose.yml

Lines changed: 10 additions & 0 deletions
@@ -0,0 +1,10 @@
version: '3'
services:
  project-processor-es:
    image: project-processor-es:latest
    build:
      context: ../
      dockerfile: docker/Dockerfile
    env_file:
      - api.env
    network_mode: "host"

docker/sample.api.env

Lines changed: 3 additions & 0 deletions
@@ -0,0 +1,3 @@
AWS_ACCESS_KEY_ID=<AWS Access Key ID>
AWS_SECRET_ACCESS_KEY=<AWS Secret Access Key>
ES_HOST=<ES Host Endpoint>
