README.md: 8 additions & 66 deletions
@@ -3,7 +3,6 @@
 ## Dependencies
 
 - Nodejs(v12+)
-- ElasticSearch
 - Kafka
 
 ## Configuration
@@ -22,41 +21,10 @@ The following parameters can be set in config files or in env variables:
 if provided, it can be either path to private key file or private key content
 - `KAFKA_MESSAGE_ORIGINATOR`: The originator value for the kafka messages
 - `KAFKA_GROUP_ID`: the Kafka group id
-- `topics.KAFKA_ERROR_TOPIC`: the error topic at which bus api will publish any errors
 - `topics.TAAS_JOB_CREATE_TOPIC`: the create job entity Kafka message topic
 - `topics.TAAS_JOB_UPDATE_TOPIC`: the update job entity Kafka message topic
-- `topics.TAAS_JOB_DELETE_TOPIC`: the delete job entity Kafka message topic
-- `topics.TAAS_JOB_CANDIDATE_CREATE_TOPIC`: the create job candidate entity Kafka message topic
 - `topics.TAAS_JOB_CANDIDATE_UPDATE_TOPIC`: the update job candidate entity Kafka message topic
-- `topics.TAAS_JOB_CANDIDATE_DELETE_TOPIC`: the delete job candidate entity Kafka message topic
-- `topics.TAAS_RESOURCE_BOOKING_CREATE_TOPIC`: the create resource booking entity Kafka message topic
-- `topics.TAAS_RESOURCE_BOOKING_UPDATE_TOPIC`: the update resource booking entity Kafka message topic
-- `topics.TAAS_RESOURCE_BOOKING_DELETE_TOPIC`: the delete resource booking entity Kafka message topic
-- `topics.TAAS_WORK_PERIOD_CREATE_TOPIC`: the create work period entity Kafka message topic
-- `topics.TAAS_WORK_PERIOD_UPDATE_TOPIC`: the update work period entity Kafka message topic
-- `topics.TAAS_WORK_PERIOD_DELETE_TOPIC`: the delete work period entity Kafka message topic
-- `topics.TAAS_WORK_PERIOD_PAYMENT_CREATE_TOPIC`: the create work period payment entity Kafka message topic
-- `topics.TAAS_WORK_PERIOD_PAYMENT_UPDATE_TOPIC`: the update work period payment entity Kafka message topic
-- `topics.TAAS_INTERVIEW_REQUEST_TOPIC`: the request interview entity Kafka message topic
-- `topics.TAAS_INTERVIEW_UPDATE_TOPIC`: the update interview entity Kafka message topic
-- `topics.TAAS_INTERVIEW_BULK_UPDATE_TOPIC`: the bulk update interview entity Kafka message topic
-- `topics.TAAS_ROLE_CREATE_TOPIC`: the create role entity Kafka message topic
-- `topics.TAAS_ROLE_UPDATE_TOPIC`: the update role entity Kafka message topic
-- `topics.TAAS_ROLE_DELETE_TOPIC`: the delete role entity Kafka message topic
-- `topics.TAAS_ACTION_RETRY_TOPIC`: the retry process Kafka message topic
-- `MAX_RETRY`: maximum allowed retry count for failed operations for sending `taas.action.retry` message
-- `BASE_RETRY_DELAY`: base amount of retry delay (ms) for failed operations
-- `BUSAPI_URL`: Topcoder Bus API URL
-- `esConfig.HOST`: Elasticsearch host
-- `esConfig.AWS_REGION`: The Amazon region to use when using AWS Elasticsearch service
-- `esConfig.ELASTICCLOUD.id`: The elastic cloud id, if your elasticsearch instance is hosted on elastic cloud. DO NOT provide a value for ES_HOST if you are using this
-- `esConfig.ELASTICCLOUD.username`: The elastic cloud username for basic authentication. Provide this only if your elasticsearch instance is hosted on elastic cloud
-- `esConfig.ELASTICCLOUD.password`: The elastic cloud password for basic authentication. Provide this only if your elasticsearch instance is hosted on elastic cloud
-- `esConfig.ES_INDEX_JOB`: the index name for job
-- `esConfig.ES_INDEX_JOB_CANDIDATE`: the index name for job candidate
-- `esConfig.ES_INDEX_RESOURCE_BOOKING`: the index name for resource booking
-- `esConfig.ES_INDEX_ROLE`: the index name for role
-- `TAAS_API_URL`: the taas api url
+
 - `auth0.AUTH0_URL`: Auth0 URL, used to get TC M2M token
 - `auth0.AUTH0_AUDIENCE`: Auth0 audience, used to get TC M2M token
 - `auth0.AUTH0_CLIENT_ID`: Auth0 client id, used to get TC M2M token
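The dotted keys in the diff above (for example `topics.TAAS_JOB_CREATE_TOPIC` or `auth0.AUTH0_URL`) are node-config style paths; a common convention, which is an assumption here and not stated anywhere in this diff, is that each leaf can be overridden by the unprefixed environment variable. A minimal shell sketch of that name mapping:

```shell
# Strip the config-section prefix to get the presumed env-variable name.
# The mapping itself is an assumption based on the key names shown above.
for key in topics.TAAS_JOB_CREATE_TOPIC auth0.AUTH0_URL; do
  echo "${key##*.}"   # prints TAAS_JOB_CREATE_TOPIC, then AUTH0_URL
done
```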
@@ -71,7 +39,7 @@ The following parameters can be set in config files or in env variables:
 - `zapier.ZAPIER_JOB_CANDIDATE_SWITCH`: decides whether posting job candidate related message to zapier or not; possible values are `ON` and `OFF`, default is `OFF`
 - `zapier.ZAPIER_JOB_CANDIDATE_WEBHOOK`: the remote zapier zap webhook url for posting job candidate related message
 
-## Local Kafka and ElasticSearch setup
+## Local Kafka setup
 
 1. Navigate to the directory `local`
 
@@ -81,30 +49,21 @@ The following parameters can be set in config files or in env variables:
-npm run delete-index # run this if you already created index
-npm run create-index
-```
-
 ## Local deployment
 
-0. Make sure that Kafka and Elasticsearch are running as per instructions above.
-
-1. Make sure to use Node v12+ by command `node -v`. We recommend using [NVM](https://github.com/nvm-sh/nvm) to quickly switch to the right version:
+0. Make sure to use Node v12+ by command `node -v`. We recommend using [NVM](https://github.com/nvm-sh/nvm) to quickly switch to the right version:
 
 ```bash
 nvm use
 ```
 
-2. From the project root directory, run the following command to install the dependencies
+1. From the project root directory, run the following command to install the dependencies
 
 ```bash
 npm install
 ```
 
-3. To run linters if required
+2. To run linters if required
 
 ```bash
 npm run lint
@@ -116,7 +75,7 @@ The following parameters can be set in config files or in env variables:
 npm run lint:fix
 ```
 
-4. Local config
+3. Local config
 
 In the `taas-es-processor` root directory create `.env` file with the next environment variables. Values for **Auth0 config** should be shared with you on the forum.<br>
 
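The local-config step above creates a `.env` file in the project root. A hypothetical sketch follows; the key names mirror config parameters named in this README (assuming the unprefixed env-variable convention), and every value is a dummy placeholder, since the real Auth0 values are shared on the forum:

```shell
# Hypothetical .env sketch. Keys mirror config parameters in this README;
# all values are placeholders, not working endpoints or credentials.
KAFKA_GROUP_ID=taas-es-processor
AUTH0_URL=https://example.auth0.com/oauth/token
AUTH0_AUDIENCE=https://example.topcoder.com
AUTH0_CLIENT_ID=dummy-client-id
ZAPIER_JOB_CANDIDATE_SWITCH=OFF
```

Remember the warning above: this file must never be committed to the repository.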
@@ -131,7 +90,7 @@ The following parameters can be set in config files or in env variables:
 - Values from this file would be automatically used by many `npm` commands.
 - ⚠️ Never commit this file or its copy to the repository!
 
-5. Start the processor and health check dropin
+4. Start the processor and health check dropin
 
 ```bash
 npm start
@@ -145,7 +104,7 @@ To run the processor using docker, follow the below steps
 
 2. Rename the file `sample.api.env` to `api.env`
 
-3. Set the required Kafka url and ElasticSearch host in the file `api.env`.
+3. Set the required Kafka url in the file `api.env`.
 
 Note that you can also add other variables to `api.env`, with `<key>=<value>` format per line.
 If using AWS ES you should add `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` variables as well.
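The `<key>=<value>` rule for `api.env` can be checked mechanically. The snippet below is illustrative only, not part of the project: it writes a tiny sample file (the `KAFKA_URL` key name is an assumption; `AWS_ACCESS_KEY_ID` comes from the text above) and then verifies that every line has the `<key>=<value>` shape:

```shell
# Write a small sample api.env (placeholder values only), then verify that
# every line matches the <key>=<value> format described above.
printf 'KAFKA_URL=localhost:9092\nAWS_ACCESS_KEY_ID=dummy\n' > api.env
if grep -Evq '^[A-Za-z_][A-Za-z0-9_]*=' api.env; then
  echo "malformed line found"
else
  echo "api.env format OK"  # this branch runs for the sample file above
fi
```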
@@ -158,20 +117,3 @@ To run the processor using docker, follow the below steps
 ```
 
 5. When you are running the application for the first time, it will take some time initially to download the image and install the dependencies
-
-## Unit Tests and E2E Tests
-
-### Unit Tests
-- Run `npm run test` to execute unit tests.
-- Run `npm run test:cov` to execute unit tests and generate coverage report.
-
-### E2E Tests
-Before running e2e tests, make sure indexes are created and the processor app is not running. Existing documents will be removed
-from ES before and after tests.
-
-- Run `npm run e2e` to execute e2e tests.
-- Run `npm run e2e:cov` to execute e2e tests and generate coverage report.