
Commit dee5668

Author: Maksym Mykhailenko (committed)
docs: improve readme using import/export scripts
ref issue #257
1 parent f06e245 commit dee5668

File tree

2 files changed (+124 lines, -116 lines)


README.md

Lines changed: 120 additions & 114 deletions
@@ -4,22 +4,24 @@ Microservice to manage CRUD operations for all things Projects.
 
 **Note: The steps below are our best-effort guide for local deployment; however, we expect the contributor, being a developer, to resolve any run-time issues (e.g. OS and Node version issues).**
 
-- [Local Development](#local-development)
-- [Requirements](#requirements)
-- [Steps to run locally](#steps-to-run-locally)
-- [Export database to json file](#export-database-to-json-file)
-- [Import database from json file, and index it](#import-database-from-json-file-and-index-it)
-- [Import sample metadata projects(Using projects service apis)](#import-sample-metadata-projectsusing-projects-service-apis)
-- [Run Connect App with Project Service locally](#run-connect-app-with-project-service-locally)
-- [Test](#test)
-- [JWT Authentication](#jwt-authentication)
-- [Deploying with docker (might need updates)](#deploying-with-docker-might-need-updates)
-- [Kafka commands](#kafka-commands)
-- [Create Topic](#create-topic)
-- [List Topics](#list-topics)
-- [Watch Topic](#watch-topic)
-- [Post Message to Topic (from stdin)](#post-message-to-topic-from-stdin)
-- [References](#references)
+- [Topcoder Projects Service](#topcoder-projects-service)
+- [Local Development](#local-development)
+- [Requirements](#requirements)
+- [Steps to run locally](#steps-to-run-locally)
+- [Import and Export data](#import-and-export-data)
+- [📤 Export data](#%f0%9f%93%a4-export-data)
+- [📥 Import data](#%f0%9f%93%a5-import-data)
+- [Run Connect App with Project Service locally](#run-connect-app-with-project-service-locally)
+- [Import metadata from api.topcoder-dev.com (deprecated)](#import-metadata-from-apitopcoder-devcom-deprecated)
+- [Test](#test)
+- [JWT Authentication](#jwt-authentication)
+- [Deploying with docker (might need updates)](#deploying-with-docker-might-need-updates)
+- [Kafka commands](#kafka-commands)
+- [Create Topic](#create-topic)
+- [List Topics](#list-topics)
+- [Watch Topic](#watch-topic)
+- [Post Message to Topic (from stdin)](#post-message-to-topic-from-stdin)
+- [References](#references)
 
 ## Local Development
 
@@ -32,13 +34,13 @@ Local setup should work good on **Linux** and **macOS**. But **Windows** is not
 * Install [libpg](https://www.npmjs.com/package/pg-native)
 
 ### Steps to run locally
-1. Install node dependencies
+1. 📦 Install npm dependencies
 
 ```bash
 npm install
 ```
 
-2. Local config
+2. Local config
 
 1. In the `tc-project-service` root directory create `.env` file with the environment variables _(values should be shared with you on the forum)_:<br>
 ```
@@ -48,7 +50,7 @@ Local setup should work good on **Linux** and **macOS**. But **Windows** is not
 AUTH0_AUDIENCE=...
 AUTH0_PROXY_SERVER_URL=...
 ```
-Values from this file would be automatically used by `docker-compose` and command `npm run start:dev` below.
+Values from this file are automatically used by `docker-compose`, the `npm run start:dev` command, and some other commands during local development.
 
 2. Copy config file `config/m2m.local.js` into `config/local.js`:
 ```bash
@@ -62,68 +64,40 @@ Local setup should work good on **Linux** and **macOS**. But **Windows** is not
 
 Alternatively, you may update `config/local.js` and replace `dockerhost` with your docker IP address.
 
-3. Start **ONE** of the docker-compose files with dependant services which are required for Project Service to work
+3. 🚢 Start docker-compose with the services that are required to run Project Service locally
 
-1. **Minimal** `./local/docker-compose.yml`:
-
-*Use this docker-compose if you only want to test and modify code of Project Service and you don't need Elasticsearch (ES) to work.*
-
-Run, in the project root folder:
-```bash
-docker-compose -f local/docker-compose.yml up
-```
-
-<details><summary>Click to see details</summary>
-<br>
-
-This docker-compose file starts the next services:
-| Service | Name | Port |
-|----------|:-----:|:----:|
-| PostgreSQL | db | 5432 |
-| Elasticsearch | esearch | 9200 |
-| RabbitMQ | queue | 5672, 15672 |
-| Mock Service (not in use) | jsonserver | 3001 |
-
-</details>
-
-2. **Full** `./local/full/docker-compose.yml`:
+```bash
+npm run local:run-docker
+```
 
-*Use this docker-compose if you want to test and modify code of Project Service together with one of the next relative services: [tc-bus-api](https://github.com/topcoder-platform/tc-bus-api), [project-processor-es](https://github.com/topcoder-platform/project-processor-es), [tc-notifications](https://github.com/topcoder-platform/tc-notifications) or you need Elasticsearch (ES) to work.*
+Wait until all containers are fully started. As a good indicator, wait until `project-processor-es` has started successfully by viewing its logs:
 
-1. Run, in the project root folder:
+```bash
+docker-compose -f local/full/docker-compose.yml logs -f project-processor-es
+```
 
-```bash
-docker-compose -f local/full/docker-compose.yml up -d
-```
+<details><summary>Click to see a good logs example</summary>
+<br>
 
-2. Wait until all containers are fully started. As a good indicator, wait until `project-processor-es` successfully started by viewing its logs:
+- first it waits for `kafka-client` to create all the required topics and exit; you would see:
 
-```bash
-docker-compose -f local/full/docker-compose.yml logs -f project-processor-es
+```
+project-processor-es_1 | Waiting for kafka-client to exit....
 ```
 
-<details><summary>Click to see example logs</summary>
-<br>
-
-- first it would be waiting for `kafka-client` to create all the required topics and exit, you would see:
-
-```
-project-processor-es_1 | Waiting for kafka-client to exit....
-```
-
-- after that, `project-processor-es` would be started itself. Make sure it successfully connected to Kafka, you should see 3 lines with text `Subscribed to project.action.`:
-
-```
-project-processor-es_1 | 2020-02-19T03:18:46.523Z DEBUG no-kafka-client Subscribed to project.action.update:0 offset 0 leader kafka:9093
-project-processor-es_1 | 2020-02-19T03:18:46.524Z DEBUG no-kafka-client Subscribed to project.action.delete:0 offset 0 leader kafka:9093
-project-processor-es_1 | 2020-02-19T03:18:46.528Z DEBUG no-kafka-client Subscribed to project.action.create:0 offset 0 leader kafka:9093
-```
-</details>
+- after that, `project-processor-es` starts itself. Make sure it has successfully connected to Kafka: you should see 3 lines with the text `Subscribed to project.action.`:
 
-3. If you want to modify the code of any of the services which are run inside this docker-compose file, you can stop such service inside docker-compose by command `docker-compose -f local/full/docker-compose.yml stop -f <SERVICE_NAME>` and run the service separately, following its README file.
+```
+project-processor-es_1 | 2020-02-19T03:18:46.523Z DEBUG no-kafka-client Subscribed to project.action.update:0 offset 0 leader kafka:9093
+project-processor-es_1 | 2020-02-19T03:18:46.524Z DEBUG no-kafka-client Subscribed to project.action.delete:0 offset 0 leader kafka:9093
+project-processor-es_1 | 2020-02-19T03:18:46.528Z DEBUG no-kafka-client Subscribed to project.action.create:0 offset 0 leader kafka:9093
+```
+</details>
 
-<details><summary>Click to see details</summary>
-<br>
+<br>
+If you want to learn more about the docker-compose configuration,
+<details><summary>see more details here</summary>
+<br>
 
 This docker-compose file starts the following services:
 | Service | Name | Port |
@@ -139,7 +113,7 @@ Local setup should work good on **Linux** and **macOS**. But **Windows** is not
 | [tc-notifications-api](https://github.com/topcoder-platform/tc-notifications) | tc-notifications-api | 4000 |
 | [tc-notifications-processor](https://github.com/topcoder-platform/tc-notifications) | tc-notifications-processor | 4001 |
 
-- as many of the Topcoder services which are run in this docker-compose require Auth0 configuration for M2M calls, that's why if we want to start this docker-compose file, we have to set environment variables `AUTH0_CLIENT_ID`, `AUTH0_CLIENT_SECRET`, `AUTH0_URL`, `AUTH0_AUDIENCE`, `AUTH0_PROXY_SERVER_URL` first and they would be passed inside containers.
+- as many of the Topcoder services in this docker-compose require Auth0 configuration for M2M calls, our docker-compose file passes the environment variables `AUTH0_CLIENT_ID`, `AUTH0_CLIENT_SECRET`, `AUTH0_URL`, `AUTH0_AUDIENCE` and `AUTH0_PROXY_SERVER_URL` to its containers. docker-compose takes them from the `.env` file if provided.
 
 - `docker-compose` automatically creates the Kafka topics used by `tc-project-service`, as listed in `local/full/kafka-client/topics.txt`.
 
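For reference, creating that `.env` file in the `tc-project-service` root could look like the sketch below. The variable names are the ones listed in the bullet above; the values are placeholders that have to be replaced with the real ones shared on the forum:

```bash
# a sketch only — replace the placeholders with the real values shared on the forum
# (this overwrites any existing .env file)
cat > .env <<'EOF'
AUTH0_CLIENT_ID=<client-id>
AUTH0_CLIENT_SECRET=<client-secret>
AUTH0_URL=<auth0-url>
AUTH0_AUDIENCE=<auth0-audience>
AUTH0_PROXY_SERVER_URL=<auth0-proxy-server-url>
EOF
```

Both `docker-compose` and the npm scripts mentioned in this diff read this file, so it only needs to be created once.
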
@@ -149,25 +123,46 @@ Local setup should work good on **Linux** and **macOS**. But **Windows** is not
 docker-compose -f local/full/docker-compose.yml logs -f SERVICE_NAME
 ```
 
-</details>
+- If you want to modify the code of any of the services run inside this docker-compose file, you can stop that service inside docker-compose with the command `docker-compose -f local/full/docker-compose.yml stop <SERVICE_NAME>` and run the service separately, following its README file.
 
-*NOTE: In production these dependencies / services are hosted & managed outside Project Service.*
+- We also have a minimal docker-compose file which doesn't start all the required services. Use it only if you are sure that you don't need all the services.
+
+<details><summary>Click to see details about the minimal docker-compose</summary>
+<br>
+
+*Use this docker-compose if you only want to test and modify the code of Project Service and you don't need Elasticsearch (ES) to work.*
+
+Run, in the project root folder:
+```bash
+docker-compose -f local/docker-compose.yml up -d
+```
+
+This docker-compose file starts the following services:
+| Service | Name | Port |
+|----------|:-----:|:----:|
+| PostgreSQL | db | 5432 |
+| Elasticsearch | esearch | 9200 |
+| RabbitMQ | queue | 5672, 15672 |
+| Mock Service (not in use) | jsonserver | 3001 |
 
-4. Create tables in DB
-```bash
-NODE_ENV=development npm run sync:db
-```
+</details>
 
-*NOTE: this will drop tables if they already exist.*
+</details>
 
-5. Create ES (Elasticsearch) indexes
-```bash
-NODE_ENV=development npm run sync:es
-```
+*NOTE: In production these dependencies / services are hosted & managed outside Project Service.*
+
+4. ♻ Init DB, ES and demo data (it clears any existing data)
+
+```bash
+npm run local:setup
+```
 
-*NOTE: This will first clear all the indices and than recreate them. So use with caution.*
+This command does 3 things:
+- create database tables (dropping them if they exist)
+- create Elasticsearch indexes (removing them if they exist)
+- import demo data from `data/demo-data.json`
 
-6. Start Project Service
+5. 🚀 Start Project Service
 
 ```bash
 npm run start:dev
@@ -176,7 +171,7 @@ Local setup should work good on **Linux** and **macOS**. But **Windows** is not
 Runs the Project Service using nodemon, so it restarts whenever any of the files is updated.
 The project service will be served on `http://localhost:8001`.
 
-7. *(Optional)* Start Project Service Kafka Consumer
+6. *(Optional)* Start Project Service Kafka Consumer
 
 *Run this only if you want to test or modify logic of `lastActivityAt` or `lastActivityBy`.*
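
A few optional sanity checks for the steps above, as a sketch: it assumes the ports from the services tables (9200 for Elasticsearch, 8001 for Project Service) and uses the standard Elasticsearch `_cluster/health` endpoint, which is not defined in this repository:

```bash
# list the containers started by the full docker-compose file and their state
docker-compose -f local/full/docker-compose.yml ps

# Elasticsearch health check — assumes ES is exposed on port 9200 as in the services tables
curl "http://localhost:9200/_cluster/health?pretty"

# Project Service — any HTTP response on port 8001 (even a 404) means `npm run start:dev` is serving
curl -i http://localhost:8001
```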

@@ -186,56 +181,55 @@ Local setup should work good on **Linux** and **macOS**. But **Windows** is not
 npm run startKafkaConsumers:dev
 ```
 
-### Export database to json file
+### Import and Export data
 
-To export data of certain models from database to json file.
+#### 📤 Export data
 
-`npm run data:export -- --file path/to-file.json`
-
-List of models that will be exported is defined in `scripts/data/dataModels.js`. You can add new models to this list,
-but make sure that new models are added to list such that each model comes after its dependencies.
-
-When we run `npm run data:export` without specifying json file , data will be exported to `data/demo-data.json`.
-
-### Import database from json file, and index it
+To export data to the default file `data/demo-data.json`, run:
+```bash
+npm run data:export
+```
 
-To import data of certain models from file to database, and create ES indices.
+If you want to export data to another file, run:
 
-`npm run data:import -- --file path/to-file.json`
+```bash
+npm run data:export -- --file path/to-file.json
+```
 
-List of models that will be imported is defined in `scripts/data/dataModels.js`. You can add new models to this list,
-but make sure that new models are added to list such that each model comes after its dependencies.
+- The list of models that will be exported is defined in `scripts/data/dataModels.js`. You can add new models to this list, but make sure that each new model is added after its dependencies.
 
-When we run `npm run data:import` without specifying json file , data will be imported from `data/demo-data.json`.
-There is sample file located at `data/demo-data.json` that can used to import and index sample data.
+#### 📥 Import data
 
-Because this commands calls topcoder services to get data like members details, so you have to set environment variables AUTH0_CLIENT_ID, AUTH0_CLIENT_SECRET, AUTH0_URL, AUTH0_AUDIENCE, AUTH0_PROXY_SERVER_URL
+*During import, data is first loaded into the database and is then indexed from the database into the Elasticsearch index.*
 
-If you encounter conflicts errors, you may need to clear database using `npm run sync:db`, and clear ES (Elasticsearch) indices using `NODE_ENV=development npm run sync:es`
+To import data from the default file `data/demo-data.json`, run:
+```bash
+npm run data:import
+```
 
-### Import sample metadata projects(Using projects service apis)
+If you want to import data from another file, run:
 
 ```bash
-CONNECT_USER_TOKEN=<connect user token> npm run demo-data
+npm run data:import -- --file path/to-file.json
 ```
-To retrieve data from DEV env we have to provide a valid user token (`CONNECT_USER_TOKEN`). You may login to http://connect.topcoder-dev.com and find the Bearer token in the request headers using browser dev tools.
 
-This command for importing data uses API to create demo data. Which has a few pecularities:
-- data in DB would be for sure created
-- data in ElasticSearch Index (ES) would be only created if services [project-processor-es](https://github.com/topcoder-platform/project-processor-es) and [tc-bus-api](https://github.com/topcoder-platform/tc-bus-api) are also started locally. If you don't start them, then imported data wouldn't be indexed in ES, and would be only added to DB. You may start them locally separately, or better use `local/full/docker-compose.yml` as described [next section](#local-deployment-with-other-topcoder-services) which would start them automatically.
-- **NOTE** During data importing a lot of records has to be indexed in ES, so you have to wait about 5-10 minutes after `npm run demo-data` is finished until imported data is indexed in ES. You may watch logs of `project-processor-es` to see if its done or no.
+- As this command calls Topcoder services to get data like member details, you have to provide the environment variables `AUTH0_CLIENT_ID`, `AUTH0_CLIENT_SECRET`, `AUTH0_URL`, `AUTH0_AUDIENCE` and `AUTH0_PROXY_SERVER_URL`; they are automatically picked up from the `.env` file if provided.
+
+- If you encounter conflict errors during import, you may need to clear the database using `NODE_ENV=development npm run sync:db` and clear the ES (Elasticsearch) indices using `NODE_ENV=development npm run sync:es`.
+
+- The list of models that will be imported is defined in `scripts/data/dataModels.js`. You can add new models to this list, but make sure that each new model is added after its dependencies.
 
 ### Run Connect App with Project Service locally
 
 To be able to run [Connect App](https://github.com/appirio-tech/connect-app) with the local setup of Project Service we have to do two things:
-1. Configurate Connect App to use locally deployed Project service inside `connect-app/config/constants/dev.js` set
+1. Configure Connect App to use the locally deployed Project Service: inside `connect-app/config/constants/dev.js` set
 
 ```js
 PROJECTS_API_URL: 'http://localhost:8001'
 TC_NOTIFICATION_URL: 'http://localhost:4000/v5/notifications' # if tc-notification-api has been locally deployed
 ```
 
-2. Bypass token validation in Project Service.
+1. Bypass token validation in Project Service.
 
 In `tc-project-service/node_modules/tc-core-library-js/lib/auth/verifier.js` add this to line 23:
 ```js
@@ -246,7 +240,19 @@ To be able to run [Connect App](https://github.com/appirio-tech/connect-app) wit
 
 *NOTE: this change only lets us bypass validation during the local development process.*
 
-3. Restart both Connect App and Project Service if they were running.
+2. Restart both Connect App and Project Service if they were running.
+
+### Import metadata from api.topcoder-dev.com (deprecated)
+
+```bash
+CONNECT_USER_TOKEN=<connect user token> npm run demo-data
+```
+To retrieve data from the DEV environment we have to provide a valid user token (`CONNECT_USER_TOKEN`). You may log in to http://connect.topcoder-dev.com and find the Bearer token in the request headers using browser dev tools.
+
+This command creates demo data through the API, which has a few peculiarities:
+- data in the DB is always created
+- data in the Elasticsearch index (ES) would only be created if the services [project-processor-es](https://github.com/topcoder-platform/project-processor-es) and [tc-bus-api](https://github.com/topcoder-platform/tc-bus-api) are also started locally. If you don't start them, the imported data wouldn't be indexed in ES and would only be added to the DB. You may start them locally separately, or better use `local/full/docker-compose.yml` as described [next section](#local-deployment-with-other-topcoder-services), which starts them automatically.
+- **NOTE** During data importing a lot of records have to be indexed in ES, so you have to wait about 5-10 minutes after `npm run demo-data` is finished until the imported data is indexed in ES. You may watch the logs of `project-processor-es` to see whether it is done.
 
 ## Test
 ```bash

package.json

Lines changed: 4 additions & 2 deletions
@@ -27,8 +27,10 @@
 "seed": "babel-node src/tests/seed.js --presets es2015",
 "demo-data": "babel-node local/seed",
 "es-db-compare": "babel-node scripts/es-db-compare",
-"data:export": "LOG_LEVEL=info babel-node scripts/data/export | bunyan -o short",
-"data:import": "LOG_LEVEL=info babel-node scripts/data/import | bunyan -o short"
+"data:export": "LOG_LEVEL=info node --require dotenv/config --require babel-core/register scripts/data/export | bunyan -o short",
+"data:import": "LOG_LEVEL=info node --require dotenv/config --require babel-core/register scripts/data/import | bunyan -o short",
+"local:run-docker": "docker-compose -f ./local/full/docker-compose.yml up -d",
+"local:setup": "NODE_ENV=development npm run sync:db && NODE_ENV=development npm run sync:es && npm run data:import"
 },
 "repository": {
 "type": "git",

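To make the new scripts concrete: judging from the definitions above, `npm run local:setup` is a shorthand for the three commands below, and `data:export` / `data:import` now load `.env` automatically via `--require dotenv/config`. The backup file name in the round-trip example is only illustrative:

```bash
# what `npm run local:setup` runs under the hood (see the script definition above)
NODE_ENV=development npm run sync:db   # recreate database tables (drops existing ones)
NODE_ENV=development npm run sync:es   # recreate Elasticsearch indexes
npm run data:import                    # import demo data from data/demo-data.json

# export/import round trip with a custom file (the path is illustrative)
npm run data:export -- --file data/my-backup.json
npm run data:import -- --file data/my-backup.json
```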