- `TOPCODER_SKILL_PROVIDER_ID`: the referenced skill provider id
- `esConfig.HOST`: the Elasticsearch host
- `esConfig.ES_INDEX_JOB`: the job index
- `esConfig.ES_INDEX_JOB_CANDIDATE`: the job candidate index
- `esConfig.ES_INDEX_RESOURCE_BOOKING`: the resource booking index
- `esConfig.AWS_REGION`: the Amazon region to use when using the AWS Elasticsearch service
- `esConfig.ELASTICCLOUD.id`: the Elastic Cloud id, if your Elasticsearch instance is hosted on Elastic Cloud. DO NOT provide a value for `ES_HOST` if you are using this
- `esConfig.ELASTICCLOUD.username`: the Elastic Cloud username for basic authentication. Provide this only if your Elasticsearch instance is hosted on Elastic Cloud
- `esConfig.ELASTICCLOUD.password`: the Elastic Cloud password for basic authentication. Provide this only if your Elasticsearch instance is hosted on Elastic Cloud
- `BUSAPI_URL`: the Topcoder Bus API URL
- `KAFKA_ERROR_TOPIC`: the error topic to which the Bus API will publish any errors
- `KAFKA_MESSAGE_ORIGINATOR`: the originator value for the Kafka messages
- `TAAS_JOB_CREATE_TOPIC`: the create job entity Kafka message topic
- `TAAS_JOB_UPDATE_TOPIC`: the update job entity Kafka message topic
- `TAAS_JOB_DELETE_TOPIC`: the delete job entity Kafka message topic
- `TAAS_JOB_CANDIDATE_CREATE_TOPIC`: the create job candidate entity Kafka message topic
- `TAAS_JOB_CANDIDATE_UPDATE_TOPIC`: the update job candidate entity Kafka message topic
- `TAAS_JOB_CANDIDATE_DELETE_TOPIC`: the delete job candidate entity Kafka message topic
- `TAAS_RESOURCE_BOOKING_CREATE_TOPIC`: the create resource booking entity Kafka message topic
- `TAAS_RESOURCE_BOOKING_UPDATE_TOPIC`: the update resource booking entity Kafka message topic
- `TAAS_RESOURCE_BOOKING_DELETE_TOPIC`: the delete resource booking entity Kafka message topic
## PostgreSQL Database Setup

- Go to https://www.postgresql.org/ to download and install PostgreSQL.
- Modify `DATABASE_URL` under `config/default.js` to match your environment.
- Run `npm run init-db` to create the tables (run `npm run init-db force` to force re-creating the tables).

## DB Migration

- `npm run migrate`: run any migration files which haven't run yet.
- `npm run migrate:undo`: revert the most recent migration.
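`npm run migrate` runs Sequelize-style migration files, each of which exports an `up` step (applied by `migrate`) and a `down` step (applied by `migrate:undo`). A skeleton of such a file, purely illustrative — the table and column names are hypothetical, not taken from the repo:

```javascript
// Illustrative migration skeleton. Table 'jobs' and column 'example_column'
// are hypothetical examples, not actual schema from taas-apis.
const migration = {
  up: async (queryInterface, Sequelize) => {
    // Applied by `npm run migrate`.
    await queryInterface.addColumn('jobs', 'example_column', {
      type: Sequelize.STRING,
      allowNull: true
    })
  },
  down: async (queryInterface) => {
    // Reverted by `npm run migrate:undo`.
    await queryInterface.removeColumn('jobs', 'example_column')
  }
}

module.exports = migration
```

Keeping `down` as an exact inverse of `up` is what makes `npm run migrate:undo` safe to run after a bad deploy.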
The following parameters can be set in the config file or via env variables:

- `database`: set via env `DB_NAME`; database name
- `host`: set via env `DB_HOST`; database host name
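The same connection details can also arrive bundled in `DATABASE_URL` (see the PostgreSQL setup above). A sketch of how such a URL decomposes into the individual settings, using Node's built-in WHATWG `URL` — the example URL and credentials are made up:

```javascript
// Decompose a made-up DATABASE_URL into the individual settings above.
const url = new URL('postgres://user:password@localhost:5432/taas')

const dbConfig = {
  host: url.hostname,              // DB_HOST -> 'localhost'
  port: Number(url.port),          //         -> 5432
  database: url.pathname.slice(1), // DB_NAME -> 'taas' (strip leading '/')
  username: url.username,
  password: url.password
}

console.log(dbConfig.database) // -> 'taas'
```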
### Steps to run locally

1. 📦 Install npm dependencies

   ```bash
   npm install
   ```

2. ⚙ Local config

   1. In the root directory create a `.env` file with the next environment variables. Values for **Auth0 config** should be shared with you on the forum.<br>
      Values from this file would be automatically used by many `npm` commands.

      ⚠️ Never commit this file or its copy to the repository!

   1. Set `dockerhost` to point to the IP address of Docker. The Docker IP address depends on your system. For example, if Docker runs on IP `127.0.0.1`, add the next line to your `/etc/hosts` file:

      ```
      127.0.0.1 dockerhost
      ```

      Alternatively, you may update the `.env` file and replace `dockerhost` with your Docker IP address.
1. 🚢 Start docker-compose with the services which are required to run Taas API locally

   *(NOTE: Please ensure that you have installed Docker version 20.10 or above, since the docker-compose file uses a feature introduced in Docker 20.10. Run `docker --version` to check your Docker version.)*

   ```bash
   npm run services:up
   ```

   Wait until all containers are fully started. As a good indicator, wait until `es-processor` has successfully started by viewing its logs:

   ```bash
   npm run services:logs -- -f es-processor
   ```

   <details><summary>🖱️ Click to see a good logs example</summary>

   ```bash
   tc-taas-es-processor | Waiting for kafka-client to exit....
   tc-taas-es-processor | [2021-01-21T02:44:43.442Z] app INFO : Starting kafka consumer
   tc-taas-es-processor | 2021-01-21T02:44:44.534Z INFO no-kafka-client Joined group taas-es-processor generationId 1 as no-kafka-client-70c25a43-af93-495e-a123-0c4f4ea389eb
   tc-taas-es-processor | 2021-01-21T02:44:44.534Z INFO no-kafka-client Elected as group leader
   ```

   </details>

   - As many of the Topcoder services in this docker-compose require Auth0 configuration for M2M calls, our docker-compose file passes the environment variables `AUTH0_CLIENT_ID`, `AUTH0_CLIENT_SECRET`, `AUTH0_URL`, `AUTH0_AUDIENCE` and `AUTH0_PROXY_SERVER_URL` to its containers. docker-compose takes them from the `.env` file if provided.

   - `docker-compose` would automatically create the Kafka topics which are used by `taas-apis`, listed in `./local/kafka-client/topics.txt`.

   - To view the logs from any container inside docker-compose, use the following command, replacing `SERVICE_NAME` with the corresponding value under the **Name** column in the above table:

     ```bash
     npm run services:logs -- -f SERVICE_NAME
     ```

   - If you want to modify the code of any of the services which run inside this docker-compose file, you can stop such a service inside docker-compose with `docker-compose -f local/docker-compose.yaml stop <SERVICE_NAME>` and run the service separately, following its README file.

     *NOTE: If Kafka (along with ZooKeeper) is stopped and brought up again on the host machine, you will need to restart the `es-processor` service by running `docker-compose -f local/docker-compose.yaml restart es-processor` so that the processor connects to the new ZooKeeper.*

   *NOTE: In production these dependencies / services are hosted & managed outside Taas API.*

2. ♻ Init DB and ES

   ```bash
   npm run local:init
   ```

   This command will do 2 things:

   - create Database tables
   - create Elasticsearch indexes

3. 🚀 Start Taas API

   ```bash
   npm run dev
   ```

   Runs the Taas API using nodemon, so it is restarted after any of the files is updated.
   The API will be served on `http://localhost:3000`.

## ElasticSearch Setup

- Go to https://www.elastic.co/downloads/ to download and install Elasticsearch.
- Modify `esConfig` under `config/default.js` to match your environment.
- Run `npm run create-index` to create the ES index.
- Run `npm run delete-index` to delete the ES index.

## Local Deployment

- Install dependencies `npm install`
- Run lint `npm run lint`
- Run lint fix `npm run lint:fix`
- Clear and init db `npm run init-db force`
- Clear and create es index
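For reference, the `.env` file from the local-config step above would contain at least the Auth0 variables that the docker-compose note lists. The placeholders below are not real values; the actual values are shared with you on the forum:

```
AUTH0_URL=<auth0-url>
AUTH0_AUDIENCE=<auth0-audience>
AUTH0_CLIENT_ID=<auth0-client-id>
AUTH0_CLIENT_SECRET=<auth0-client-secret>
AUTH0_PROXY_SERVER_URL=<auth0-proxy-server-url>
```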
## NPM Commands

| Command | Description |
| -- | -- |
| `npm start` | Start app. |
| `npm run dev` | Start app using `nodemon`. |
| `npm run lint` | Check for lint errors. |
| `npm run lint:fix` | Check for lint errors and fix errors automatically when possible. |
| `npm run services:up` | Start services via docker-compose for local development. |
| `npm run services:down` | Stop services via docker-compose for local development. |
| `npm run services:logs -- -f <service_name>` | View logs of some service inside docker-compose. |
| `npm run local:init` | Create Database and Elasticsearch indexes. |
| `npm run init-db` | Create database. |
| `npm run init-db force` | Force re-creating database. |
| `npm run create-index` | Create Elasticsearch indexes. |
| `npm run delete-index` | Delete Elasticsearch indexes. |
| `npm run migrate` | Run DB migration. |
| `npm run migrate:undo` | Undo DB migration executed previously. |
| `npm run test-data` | Insert test data. |
| `npm run test` | Run tests. |
| `npm run cov` | Run tests with coverage. |
## Kafka Commands

You can use the following commands to manipulate Kafka topics and messages:

(Replace `TOPIC_NAME` with the name of the desired topic)