# Topcoder - Project Elasticsearch Processor

## Dependencies

- nodejs https://nodejs.org/en/ (v8+)
- Kafka
- ElasticSearch
- Docker, Docker Compose

## Configuration

Configuration for the processor is at `config/default.js`.
The following parameters can be set in config files or in env variables:

- LOG_LEVEL: the log level; default value: 'debug'
- KAFKA_URL: comma-separated Kafka hosts; default value: 'localhost:9092'
- KAFKA_GROUP_ID: the Kafka group id; default value: 'project-processor-es'
- KAFKA_CLIENT_CERT: Kafka connection certificate, optional; default value is undefined;
  if not provided, an insecure connection is used instead of SSL;
  if provided, it can be either the path to the certificate file or the certificate content
- KAFKA_CLIENT_CERT_KEY: Kafka connection private key, optional; default value is undefined;
  if not provided, an insecure connection is used instead of SSL;
  if provided, it can be either the path to the private key file or the private key content
- CREATE_DATA_TOPIC: create data Kafka topic; default value: 'project.notification.create'
- UPDATE_DATA_TOPIC: update data Kafka topic; default value: 'project.notification.update'
- DELETE_DATA_TOPIC: delete data Kafka topic; default value: 'project.notification.delete'
- KAFKA_MESSAGE_ORIGINATOR: Kafka topic originator; default value: 'project-api'
- esConfig: config object for Elasticsearch

Refer to the `esConfig` variable in `config/default.js` for ES-related configuration.
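
As a rough sketch of how such settings are typically resolved, each entry in `config/default.js` usually reads its environment variable and falls back to the documented default. This is an illustrative sketch only, not the actual file contents:

```javascript
// Hypothetical sketch of config/default.js resolution: an environment
// variable, when set, overrides the documented default value.
const config = {
  LOG_LEVEL: process.env.LOG_LEVEL || 'debug',
  KAFKA_URL: process.env.KAFKA_URL || 'localhost:9092',
  KAFKA_GROUP_ID: process.env.KAFKA_GROUP_ID || 'project-processor-es',
  KAFKA_MESSAGE_ORIGINATOR: process.env.KAFKA_MESSAGE_ORIGINATOR || 'project-api',
  CREATE_DATA_TOPIC: process.env.CREATE_DATA_TOPIC || 'project.notification.create',
  UPDATE_DATA_TOPIC: process.env.UPDATE_DATA_TOPIC || 'project.notification.update',
  DELETE_DATA_TOPIC: process.env.DELETE_DATA_TOPIC || 'project.notification.delete'
};

console.log(config.KAFKA_URL);
```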

Also note that there is a `/health` endpoint that checks the health of the app. It sets up an Express server listening on the environment variable `PORT`. `PORT` is not part of the configuration file and must be passed as an environment variable.

Config for tests is at `config/test.js`; it overrides some of the default config.

## Local Kafka setup

- Call the extracted directory kafka_2.11-1.1.0 : `path_to_kafka`
- Call our project root directory : `our_project_root_directory`
- `http://kafka.apache.org/quickstart` contains details to set up and manage a Kafka server;
  the steps below set up a Kafka server on Mac; on Windows, use the bat commands in bin/windows instead
- Download kafka at `https://www.apache.org/dyn/closer.cgi?path=/kafka/1.1.0/kafka_2.11-1.1.0.tgz`
- Extract the downloaded tgz file
- Go to the extracted directory kafka_2.11-1.1.0
- Start the ZooKeeper server:
  `bin/zookeeper-server-start.sh config/zookeeper.properties`
- In another terminal, go to the same directory and start the Kafka server:
  `bin/kafka-server-start.sh config/server.properties`
- Note that the ZooKeeper server is at localhost:2181 and the Kafka server is at localhost:9092
- In another terminal, go to the same directory and create the topics:
  `bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic project.notification.create`
  `bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic project.notification.update`
  `bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic project.notification.delete`
- Verify that the topics are created:
  `bin/kafka-topics.sh --list --zookeeper localhost:2181`
  should list the created topics
- Run the producer and then write a message into the console to send to the `project.notification.create` topic:
  `bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.notification.create`
  In the console, write messages, one message per line:
  `{"topic":"project.notification.create","originator":"project-api","timestamp":"2019-06-20T13:43:25.817Z","mime-type":"application/json","payload":{"resource":"project","createdAt":"2019-06-20T13:43:23.554Z","updatedAt":"2019-06-20T13:43:23.555Z","terms":[],"id":1,"name":"test project","description":"Hello I am a test project","type":"app","createdBy":40051333,"updatedBy":40051333,"projectEligibility":[],"bookmarks":[],"external":null,"status":"draft","lastActivityAt":"2019-06-20T13:43:23.514Z","lastActivityUserId":"40051333","members":[{"createdAt":"2019-06-20T13:43:23.555Z","updatedAt":"2019-06-20T13:43:23.625Z","id":2,"isPrimary":true,"role":"manager","userId":40051333,"updatedBy":40051333,"createdBy":40051333,"projectId":2,"deletedAt":null,"deletedBy":null}],"version":"v2","directProjectId":null,"billingAccountId":null,"estimatedPrice":null,"actualPrice":null,"details":null,"cancelReason":null,"templateId":null,"deletedBy":null,"attachments":null,"phases":null,"projectUrl":"https://connect.topcoder-dev.com/projects/2"}}`
- Optionally, in another terminal, go to the same directory and start a consumer to view the messages:
  `bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic project.notification.create --from-beginning`
- If Kafka does not accept such a long message typed into the console, you can send the message from a file instead:
  `path_to_kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.notification.create < our_project_root_directory/test/test_topic/project/project.notification.create.json`
- Writing/reading messages to/from the other topics is similar. All example messages are in:
  `our_project_root_directory/test/test_topic`
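
Each message in the examples above is a JSON envelope with `topic`, `originator`, `timestamp`, `mime-type` and `payload` fields. A sketch of a check for that envelope shape (the processor's real validation may differ):

```javascript
// Sketch: verify that a raw Kafka message is a well-formed event envelope
// before using its payload. Field names follow the examples in this README.
function parseEnvelope(raw) {
  let message;
  try {
    message = JSON.parse(raw);
  } catch (err) {
    throw new Error(`Invalid message JSON: ${err.message}`);
  }
  for (const field of ['topic', 'originator', 'timestamp', 'mime-type', 'payload']) {
    if (message[field] === undefined) {
      throw new Error(`Invalid message, missing field: ${field}`);
    }
  }
  return message;
}

const raw = JSON.stringify({
  topic: 'project.notification.create',
  originator: 'project-api',
  timestamp: '2019-06-20T13:43:25.817Z',
  'mime-type': 'application/json',
  payload: { resource: 'project', id: 1, name: 'test project' }
});
console.log(parseEnvelope(raw).payload.name); // test project
```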

## Local Elasticsearch setup

- In the `docker-es` folder, run `docker-compose up`

## Local deployment

- Install dependencies: `npm i`
- Run code lint check: `npm run lint`; running `npm run lint:fix` can fix some lint errors, if any
- Initialize Elasticsearch and create the configured Elasticsearch index if not present: `npm run sync:es`
- Start the processor app: `npm start`

## Local Deployment with Docker

To run the processor using Docker, follow the steps below:

1. Navigate to the directory `docker`

2. Rename the file `sample.api.env` to `api.env`

3. Set the required AWS credentials in the file `api.env`

4. Once that is done, run the following command:

```
docker-compose up
```

5. When you run the application for the first time, it will take some time to download the image and install the dependencies

## Integration tests

Integration tests use different indices, `projects_test`, `timelines_test` and `metadata_test`, instead of the usual `projects`, `timelines` and `metadata` indices.

While running tests, the index names can be overridden using environment variables, or left as-is to use the default test indices defined in `config/test.js`:

```
export ES_PROJECT_INDEX=projects_test
export ES_TIMELINE_INDEX=timelines_test
export ES_METADATA_INDEX=metadata_test
```
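
Assuming `config/test.js` follows the same env-with-fallback pattern as `config/default.js`, the test indices would be resolved along these lines (an illustrative sketch, not the actual file contents):

```javascript
// Hypothetical sketch of test index selection: the environment variables
// above win; otherwise the *_test defaults from config/test.js apply.
const testIndices = {
  ES_PROJECT_INDEX: process.env.ES_PROJECT_INDEX || 'projects_test',
  ES_TIMELINE_INDEX: process.env.ES_TIMELINE_INDEX || 'timelines_test',
  ES_METADATA_INDEX: process.env.ES_METADATA_INDEX || 'metadata_test'
};

console.log(testIndices);
```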

#### Running integration tests and coverage

To run tests alone:

```
npm run test
```

To run tests with coverage report:

```
npm run test:cov
```

## Verification

- Call the extracted directory kafka_2.11-1.1.0 : `path_to_kafka`
- Call our project root directory : `our_project_root_directory`
- Start the Kafka server, start Elasticsearch, initialize Elasticsearch, and start the processor app
- Send message:
  `path_to_kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.notification.create < our_project_root_directory/test/test_topic/project/project.notification.create.json`
- Run command `npm run view-data projects 1` to view the created data; you will see that the data was properly created:

```bash
info: Elasticsearch Project data:
info: {
  "createdAt": "2019-06-20T13:43:23.554Z",
  "updatedAt": "2019-06-20T13:43:23.555Z",
  "terms": [],
  "id": 1,
  "name": "test project",
  "description": "Hello I am a test project",
  "type": "app",
  "createdBy": 40051333,
  "updatedBy": 40051333,
  "projectEligibility": [],
  "bookmarks": [],
  "external": null,
  "status": "draft",
  "lastActivityAt": "2019-06-20T13:43:23.514Z",
  "lastActivityUserId": "40051333",
  "members": [
    {
      "createdAt": "2019-06-20T13:43:23.555Z",
      "updatedAt": "2019-06-20T13:43:23.625Z",
      "id": 2,
      "isPrimary": true,
      "role": "manager",
      "userId": 40051333,
      "updatedBy": 40051333,
      "createdBy": 40051333,
      "projectId": 2,
      "deletedAt": null,
      "deletedBy": null
    }
  ],
  "version": "v2",
  "directProjectId": null,
  "billingAccountId": null,
  "estimatedPrice": null,
  "actualPrice": null,
  "details": null,
  "cancelReason": null,
  "templateId": null,
  "deletedBy": null,
  "attachments": null,
  "phases": null,
  "projectUrl": "https://connect.topcoder-dev.com/projects/2"
}
```

- Run the producer and then write some invalid messages into the console to send to the `project.notification.create` topic:
  `bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.notification.create`
  In the console, write messages, one message per line:
  `{ "topic": "project.notification.create", "originator": "project-api", "timestamp": "2019-02-16T00:00:00", "mime-type": "application/json", "payload": { "id": "invalid", "typeId": "8e17090c-465b-4c17-b6d9-dfa16300b0ff", "track": "Code", "name": "test", "description": "desc", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0aa", "phases": [{ "id": "8e17090c-465b-4c17-b6d9-dfa16300b012", "name": "review", "isActive": true, "duration": 10000 }], "prizeSets": [{ "type": "prize", "prizes": [{ "type": "winning prize", "value": 500 }] }], "reviewType": "code review", "tags": ["code"], "projectId": 123, "forumId": 456, "status": "Active", "created": "2019-02-16T00:00:00", "createdBy": "admin" } }`

  `{ "topic": "project.notification.create", "originator": "project-api", "timestamp": "2019-02-16T00:00:00", "mime-type": "application/json", "payload": { "id": "173803d3-019e-4033-b1cf-d7205c7f774c", "typeId": "8e17090c-465b-4c17-b6d9-dfa16300b0ff", "track": "Code", "name": "test", "description": "desc", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0aa", "phases": [{ "id": "8e17090c-465b-4c17-b6d9-dfa16300b012", "name": "review", "isActive": true, "duration": 10000 }], "prizeSets": [{ "type": "prize", "prizes": [{ "type": "winning prize", "value": 500 }] }], "reviewType": "code review", "tags": ["code"], "projectId": 123, "forumId": -456, "status": "Active", "created": "2018-01-02T00:00:00", "createdBy": "admin" } }`

  `{ [ { abc`
- Then in the app console, you will see error messages
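
The error messages appear because the processor rejects messages that are not valid JSON or whose payload fails validation, rather than crashing. A sketch of that behavior (`handleMessage` and `processPayload` are illustrative names, not the processor's actual API):

```javascript
// Sketch: log and skip invalid messages, keep consuming the topic.
// `processPayload` stands in for the real indexing logic.
function handleMessage(raw, processPayload) {
  let message;
  try {
    message = JSON.parse(raw);
  } catch (err) {
    console.error(`error: invalid message JSON: ${err.message}`);
    return false; // skip this message, keep consuming
  }
  if (!message.payload || typeof message.payload !== 'object') {
    console.error('error: message has no payload, skipping');
    return false;
  }
  processPayload(message.payload);
  return true;
}

// A valid envelope is processed; garbage like `{ [ { abc` is only logged.
const ok = handleMessage(
  '{"topic":"project.notification.create","payload":{"id":1}}',
  (p) => console.log(`indexing project ${p.id}`)
);
const bad = handleMessage('{ [ { abc', () => {});
console.log(ok, bad); // true false
```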

- Send message to update data:
  `path_to_kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.notification.update < our_project_root_directory/test/test_topic/project/project.notification.update.json`
- Run command `npm run view-data projects 1` to view the updated data; you will see that the data was properly updated:

```bash
info: Elasticsearch Project data:
info: {
  "createdAt": "2019-06-20T13:43:23.554Z",
  "updatedAt": "2019-06-20T13:45:20.091Z",
  "terms": [],
  "id": 1,
  "name": "project name updated",
  "description": "Hello I am a test project",
  "type": "app",
  "createdBy": 40051333,
  "updatedBy": 40051333,
  "projectEligibility": [],
  "bookmarks": [],
  "external": null,
  "status": "draft",
  "lastActivityAt": "2019-06-20T13:43:23.514Z",
  "lastActivityUserId": "40051333",
  "members": [
    {
      "createdAt": "2019-06-20T13:43:23.555Z",
      "deletedAt": null,
      "role": "manager",
      "updatedBy": 40051333,
      "createdBy": 40051333,
      "isPrimary": true,
      "id": 2,
      "userId": 40051333,
      "projectId": 2,
      "deletedBy": null,
      "updatedAt": "2019-06-20T13:43:23.625Z"
    }
  ],
  "version": "v2",
  "directProjectId": null,
  "billingAccountId": null,
  "estimatedPrice": null,
  "actualPrice": null,
  "details": null,
  "cancelReason": null,
  "templateId": null,
  "deletedBy": null,
  "attachments": [],
  "phases": [],
  "projectUrl": "https://connect.topcoder-dev.com/projects/2",
  "invites": [],
  "utm": null
}
```

- Run the producer and then write some invalid messages into the console to send to the `project.notification.update` topic:
  `bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.notification.update`
  In the console, write messages, one message per line:
  `{ "topic": "project.notification.update", "originator": "project-api", "timestamp": "2019-02-17T01:00:00", "mime-type": "application/json", "payload": { "id": "173803d3-019e-4033-b1cf-d7205c7f774c", "typeId": "123", "track": "Code", "name": "test3", "description": "desc3", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0dd", "groups": ["group2", "group3"], "updated": "2019-02-17T01:00:00", "updatedBy": "admin" } }`

  `{ "topic": "project.notification.update", "originator": "project-api", "timestamp": "2019-02-17T01:00:00", "mime-type": "application/json", "payload": { "id": "173803d3-019e-4033-b1cf-d7205c7f774c", "typeId": "8e17090c-465b-4c17-b6d9-dfa16300b0ff", "track": ["Code"], "name": "test3", "description": "desc3", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0dd", "groups": ["group2", "group3"], "updated": "2019-02-17T01:00:00", "updatedBy": "admin" } }`

  `[ [ [ } } }`

- Then in the app console, you will see error messages

- To test the health check API, run `export PORT=5000`, start the processor, then browse `http://localhost:5000/health` in a browser;
  you will see the result `{"checksRun":1}`