Microservice to manage CRUD operations for all things Projects.
**Note: The steps below are our best-effort guide for local deployment; however, we expect contributors, as developers, to resolve any run-time issues (e.g. OS and Node version issues) themselves.**
- [Local Development](#local-development)
  - [Requirements](#requirements)
  - [Steps to run locally](#steps-to-run-locally)
  - [Export database to json file](#export-database-to-json-file)
  - [Import database from json file, and index it](#import-database-from-json-file-and-index-it)
  - [Import sample metadata projects (using projects service APIs)](#import-sample-metadata-projectsusing-projects-service-apis)
  - [Run Connect App with Project Service locally](#run-connect-app-with-project-service-locally)
- [Test](#test)
- [JWT Authentication](#jwt-authentication)
- [Deploying with docker (might need updates)](#deploying-with-docker-might-need-updates)
- [Kafka commands](#kafka-commands)
  - [Create Topic](#create-topic)
  - [List Topics](#list-topics)
  - [Watch Topic](#watch-topic)
  - [Post Message to Topic (from stdin)](#post-message-to-topic-from-stdin)
Local setup should work well on **Linux** and **macOS**. **Windows** is not supported.

1. In the `tc-project-service` root directory create a `.env` file with the environment variables _(values should be shared with you on the forum)_:<br>

   ```
   AUTH0_AUDIENCE=...
   AUTH0_PROXY_SERVER_URL=...
   ```

   Values from this file will be used automatically by `docker-compose`, by `npm run start:dev`, and by some other commands during local development.
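If you ever need these variables in a plain shell session (outside of `docker-compose` and npm scripts), one common approach is to export the file into the current shell. A minimal sketch — the file and values below are placeholders, and this simple sourcing assumes no spaces or special quoting in the values:

```bash
# Work in a scratch directory with a sample .env (placeholder values):
cd "$(mktemp -d)"
cat > .env <<'EOF'
AUTH0_AUDIENCE=example-audience
AUTH0_PROXY_SERVER_URL=https://example.invalid/token
EOF

# 'set -a' marks every subsequently assigned variable for export,
# so sourcing .env publishes its variables to child processes too.
set -a
. ./.env
set +a

echo "AUTH0_AUDIENCE=$AUTH0_AUDIENCE"
```

In the real repo you would source the actual `.env` in the project root instead of the scratch copy.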
2. Copy config file `config/m2m.local.js` into `config/local.js`:
   ```bash
   cp config/m2m.local.js config/local.js
   ```
Alternatively, you may update `config/local.js` and replace `dockerhost` with your docker IP address.
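For the replacement route, a `sed` one-liner works. This is a sketch in a scratch directory — the config key and the IP `192.168.99.100` are made-up examples; point it at your real `config/local.js` with your actual docker IP:

```bash
# Demo in a scratch directory: create a stand-in config file.
cd "$(mktemp -d)" && mkdir config
echo "module.exports = { esHost: 'dockerhost' };" > config/local.js

# Replace every 'dockerhost' occurrence in-place (keeps a .bak backup).
sed -i.bak 's/dockerhost/192.168.99.100/g' config/local.js
cat config/local.js
```

The `-i.bak` form works with both GNU and BSD `sed`, so the command behaves the same on Linux and macOS.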
3. 🚢 Start docker-compose with the services that are required to run Project Service locally:

   ```bash
   npm run local:run-docker
   ```
   Wait until all containers are fully started. As a good indicator, wait until `project-processor-es` has started successfully by viewing its logs:

   - first, it waits for `kafka-client` to create all the required topics and exit; you would see:

     ```
     project-processor-es_1  | Waiting for kafka-client to exit....
     ```

   - after that, `project-processor-es` starts itself. Make sure it has successfully connected to Kafka: you should see 3 log lines containing the text `Subscribed to project.action.`.
   - As many of the Topcoder services in this docker-compose require Auth0 configuration for M2M calls, our docker-compose file passes the environment variables `AUTH0_CLIENT_ID`, `AUTH0_CLIENT_SECRET`, `AUTH0_URL`, `AUTH0_AUDIENCE`, and `AUTH0_PROXY_SERVER_URL` to its containers. `docker-compose` takes them from the `.env` file if provided.

   - `docker-compose` automatically creates the Kafka topics used by `tc-project-service`, listed in `local/full/kafka-client/topics.txt`.

   - If you want to modify the code of any of the services run inside this docker-compose file, you can stop that service with `docker-compose -f local/full/docker-compose.yml stop <SERVICE_NAME>` and run it separately, following its README file.

   - We also have a minimal docker-compose file which doesn't start all the required services. Use it only if you are sure that you don't need all of them.
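To check the Kafka subscriptions without scrolling, you can pipe the processor logs through `grep`. The sketch below runs on illustrative, hard-coded log lines (the real topic names may differ); in practice you would pipe `docker-compose -f local/full/docker-compose.yml logs project-processor-es` into the same `grep`:

```bash
# Illustrative log excerpt (not captured from a real run):
logs='project-processor-es_1  | Subscribed to project.action.create
project-processor-es_1  | Subscribed to project.action.update
project-processor-es_1  | Subscribed to project.action.delete'

# Count the subscription lines; 3 means the consumer is connected.
count=$(printf '%s\n' "$logs" | grep -c 'Subscribed to project\.action\.')
echo "subscription lines: $count"
```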
   <details><summary>Click to see details about the minimal docker-compose</summary>
   <br>

   *Use this docker-compose if you only want to test and modify the code of Project Service and you don't need Elasticsearch (ES) to work.*

   Run, in the project root folder:

   ```bash
   docker-compose -f local/docker-compose.yml up -d
   ```

   This docker-compose file starts the following services:

   | Service | Name | Port |
   |----------|:-----:|:----:|
   | PostgreSQL | db | 5432 |
   | Elasticsearch | esearch | 9200 |
   | RabbitMQ | queue | 5672, 15672 |
   | Mock Service (not in use) | jsonserver | 3001 |

   </details>
   *NOTE: In production these dependencies / services are hosted & managed outside Project Service.*

4. ♻ Init DB, ES and demo data (this clears any existing data)
   ```bash
   npm run local:setup
   ```

   This command does 3 things:

   - create database tables (dropping them first if they exist)
   - create Elasticsearch indexes (removing them first if they exist)
   - import demo data from `data/demo-data.json`
5. 🚀 Start Project Service

   ```bash
   npm run start:dev
   ```

   Runs the Project Service using nodemon, so it restarts automatically whenever any file is updated.

   The project service will be served on `http://localhost:8001`.

6. *(Optional)* Start Project Service Kafka Consumer

   *Run this only if you want to test or modify the logic of `lastActivityAt` or `lastActivityBy`.*

   ```bash
   npm run startKafkaConsumers:dev
   ```
### Import and Export data

#### 📤 Export data

To export data to the default file `data/demo-data.json`, run:

```bash
npm run data:export
```

If you want to export data to another file, run:

```bash
npm run data:export -- --file path/to-file.json
```

- The list of models that will be exported is defined in `scripts/data/dataModels.js`. You can add new models to this list, but make sure each new model comes after its dependencies.
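To see which models actually ended up in an exported file, you can list its top-level keys. A sketch on a stand-in file — the model names here are hypothetical (the real list lives in `scripts/data/dataModels.js`), and it assumes `python3` is available:

```bash
cd "$(mktemp -d)"
# Stand-in for an export like data/demo-data.json (shape is illustrative):
cat > demo-data.json <<'EOF'
{ "Project": [], "ProjectMember": [] }
EOF

# Print the exported model names, i.e. the top-level JSON keys:
python3 -c "import json; print(sorted(json.load(open('demo-data.json'))))"
```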
#### 📥 Import data

*During import, data is first written to the database and then indexed from the database into Elasticsearch.*

To import data from the default file `data/demo-data.json`, run:

```bash
npm run data:import
```

If you want to import data from another file, run:

```bash
npm run data:import -- --file path/to-file.json
```

- As this command calls Topcoder services to get data such as member details, you have to provide the environment variables `AUTH0_CLIENT_ID`, `AUTH0_CLIENT_SECRET`, `AUTH0_URL`, `AUTH0_AUDIENCE`, and `AUTH0_PROXY_SERVER_URL`; they are picked up automatically from the `.env` file if provided.

- If you encounter conflict errors during import, you may need to clear the database using `NODE_ENV=development npm run sync:db` and clear the ES (Elasticsearch) indices using `NODE_ENV=development npm run sync:es`.

- The list of models that will be imported is defined in `scripts/data/dataModels.js`. You can add new models to this list, but make sure each new model comes after its dependencies.
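Before importing a hand-edited file, it can save time to confirm it is valid JSON first. A sketch on a stand-in file — the path is an example, and it assumes `python3` is available:

```bash
cd "$(mktemp -d)"
# Stand-in for the file you are about to import:
echo '{ "projects": [] }' > demo-data.json

# json.tool exits non-zero on malformed JSON, so it works as a validator.
if python3 -m json.tool demo-data.json > /dev/null 2>&1; then
  echo "OK: valid JSON"
else
  echo "ERROR: invalid JSON" >&2
fi
```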
### Run Connect App with Project Service locally

To be able to run [Connect App](https://github.com/appirio-tech/connect-app) with the local setup of Project Service we have to do two things:

1. Configure Connect App to use the locally deployed Project Service: inside `connect-app/config/constants/dev.js` set

   ```js
   PROJECTS_API_URL: 'http://localhost:8001',
   TC_NOTIFICATION_URL: 'http://localhost:4000/v5/notifications', // if tc-notifications has been deployed locally
   ```

2. Bypass token validation in Project Service.

   In `tc-project-service/node_modules/tc-core-library-js/lib/auth/verifier.js` add this to line 23:

   ```js
   ```

   *NOTE: this change only lets us bypass validation during local development.*

3. Restart both Connect App and Project Service if they were running.
### Import metadata from api.topcoder-dev.com (deprecated)

```bash
CONNECT_USER_TOKEN=<connect user token> npm run demo-data
```

To retrieve data from the DEV environment we have to provide a valid user token (`CONNECT_USER_TOKEN`). You may log in to http://connect.topcoder-dev.com and find the Bearer token in the request headers using your browser's dev tools.

This command uses the API to create demo data, which has a few peculiarities:

- data in the DB will definitely be created;
- data in the Elasticsearch (ES) index will only be created if the services [project-processor-es](https://github.com/topcoder-platform/project-processor-es) and [tc-bus-api](https://github.com/topcoder-platform/tc-bus-api) are also started locally. If you don't start them, the imported data won't be indexed in ES and will only be added to the DB. You may start them separately, or better, use `local/full/docker-compose.yml` as described in [this section](#local-deployment-with-other-topcoder-services), which starts them automatically.

**NOTE:** During data import a lot of records have to be indexed in ES, so you have to wait about 5-10 minutes after `npm run demo-data` finishes until the imported data is indexed in ES. You may watch the logs of `project-processor-es` to see whether it is done.