Microservice to manage CRUD operations for all things Projects.
**Note: The steps below are our best-effort guide for local deployment. We expect contributors, as developers, to resolve any run-time issues themselves (e.g. OS and Node version issues).**
- [Run Connect App with Project Service locally](#run-connect-app-with-project-service-locally)
- [Test](#test)
  - [JWT Authentication](#jwt-authentication)
- [Deploying with docker (might need updates)](#deploying-with-docker-might-need-updates)
- [Kafka commands](#kafka-commands)
  - [Create Topic](#create-topic)
  - [List Topics](#list-topics)
  - [Watch Topic](#watch-topic)
  - [Post Message to Topic (from stdin)](#post-message-to-topic-from-stdin)
- [References](#references)
## Local Development
Local setup should work well on **Linux** and **macOS**. **Windows** is not supported at the moment.
### Requirements
* [docker-compose](https://docs.docker.com/compose/install/) - We use docker-compose for running dependencies locally.
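
A quick way to confirm the local toolchain is available before you start (exact versions are not prescribed here; any reasonably recent ones should do):

```bash
docker --version
docker-compose --version
node --version
npm --version
```
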
### Steps to run locally
1. Install node dependencies
```bash
npm install
```
3. Start **ONE** of the docker-compose files with the dependent services required for Project Service to work
1. **Minimal** `./local/docker-compose.yml`:
*Use this docker-compose file if you only want to test and modify the code of Project Service and you don't need Elasticsearch (ES) to work.*
Run, inside folder `./local`:
```bash
docker-compose up
```
<details><summary>Click to see details</summary>
<br>
This docker-compose file starts the following services:
| Service | Name | Port |
|----------|:-----:|:----:|
| PostgreSQL | db | 5432 |
| PostgreSQL (for tests) | db_test | 5432 |
| Elasticsearch | esearch | 9200 |
| RabbitMQ | queue | 5672, 15672 |
| Mock Service (not in use) | jsonserver | 3001 |
</details>
2. **Full** `./local/full/docker-compose.yml`:
*Use this docker-compose file if you want to test and modify the code of Project Service together with one of the following related services: [tc-bus-api](https://github.com/topcoder-platform/tc-bus-api), [project-processor-es](https://github.com/topcoder-platform/project-processor-es), [tc-notifications](https://github.com/topcoder-platform/tc-notifications), or if you need Elasticsearch (ES) to work.*
1. Set environment variables `AUTH0_CLIENT_ID`, `AUTH0_CLIENT_SECRET`, `AUTH0_URL`, `AUTH0_AUDIENCE`, `AUTH0_PROXY_SERVER_URL` (example `export` commands are shown at the end of this step)
2. Run, inside folder `./local/full`:
```bash
docker-compose up -d
```
3. Wait until all containers are fully started. As a good indicator, wait until `project-processor-es` has started successfully by viewing its logs:
```bash
docker-compose logs -f project-processor-es
```
<details><summary>Click to see example logs</summary>
<br>
- first it waits for `kafka-client` to create all the required topics and exit; you would see:
```
project-processor-es_1 | Waiting for kafka-client to exit....
```
- after that, `project-processor-es` starts itself. Make sure it has successfully connected to Kafka: you should see 3 lines with the text `Subscribed to project.action.` in its logs. If you don't see such lines, restart only the `project-processor-es` service by running `docker-compose restart project-processor-es`.

</details>
4. If you want to modify the code of any of the services run inside this docker-compose file, you can stop that service with `docker-compose stop <SERVICE_NAME>` and run it separately, following its README file.
<details><summary>Click to see details</summary>
<br>
This docker-compose file starts the following services:
- Many of the Topcoder services run in this docker-compose file require Auth0 configuration for M2M calls. That's why, before starting this docker-compose file, we have to set the environment variables `AUTH0_CLIENT_ID`, `AUTH0_CLIENT_SECRET`, `AUTH0_URL`, `AUTH0_AUDIENCE`, `AUTH0_PROXY_SERVER_URL`; they are passed inside the containers.
- `docker-compose` automatically creates the Kafka topics used by `tc-project-service`, listed in `local/full/kafka-client/topics.txt`.
- To view the logs from any container inside docker-compose use the following command, replacing `SERVICE_NAME` with the corresponding value under the **Name** column in the above table:
```bash
cd local/full
docker-compose logs -f SERVICE_NAME
```
</details>
*NOTE: In production these dependencies / services are hosted & managed outside Project Service.*
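
For reference, the M2M variables from sub-step 1 can be set with plain shell exports before starting the **Full** docker-compose (the values below are placeholders):

```bash
export AUTH0_CLIENT_ID=<insert required value here>
export AUTH0_CLIENT_SECRET=<insert required value here>
export AUTH0_URL=<insert required value here>
export AUTH0_AUDIENCE=<insert required value here>
export AUTH0_PROXY_SERVER_URL=<insert required value here>

cd local/full
docker-compose up -d
```
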
4. Local config
1. Copy config file `config/m2m.local.js` into `config/local.js`:
```bash
cp config/m2m.local.js config/local.js
```
2. Set `dockerhost` to point to the IP address of Docker. The Docker IP address depends on your system. For example, if Docker runs on IP `127.0.0.1`, add the next line to your `/etc/hosts` file:
```
127.0.0.1 dockerhost
```
Alternatively, you may update `config/local.js` and replace `dockerhost` with your docker IP address.
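
To confirm the mapping is in place, you can check that `dockerhost` resolves from your shell (works on Linux and macOS):

```bash
# Should resolve to the IP you mapped above, e.g. 127.0.0.1
ping -c 1 dockerhost
```
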
5. Create tables in DB
```bash
NODE_ENV=development npm run sync:db
```
*NOTE: this will drop tables if they already exist.*
6. Create ES (Elasticsearch) indexes
```bash
NODE_ENV=development npm run sync:es
```
*NOTE: This will first clear all the indices and then recreate them. So use with caution.*
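
Once this has finished (and the Elasticsearch container from docker-compose is running), one way to confirm the indices exist is to query ES directly; `dockerhost:9200` assumes the port mapping shown in the services table above:

```bash
curl "http://dockerhost:9200/_cat/indices?v"
```
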
7. Start Project Service
1. Set environment variables `AUTH0_CLIENT_ID`, `AUTH0_CLIENT_SECRET`, `AUTH0_URL`, `AUTH0_AUDIENCE`, `AUTH0_PROXY_SERVER_URL`
2. Run
```bash
npm run start:dev
```
This runs the Project Service using nodemon, so it is restarted automatically whenever any file is updated.
The project service will be served on `http://localhost:8001`.
8. *(Optional)* Start Project Service Kafka Consumer
*Run this only if you want to test or modify the logic of `lastActivityAt` or `lastActivityBy`.*
In another terminal window run:
```bash
npm run startKafkaConsumers:dev
```
### Import sample metadata & projects
```bash
CONNECT_USER_TOKEN=<connect user token> npm run demo-data
```

This command for importing data uses the API to create demo data, which has a few peculiarities:
- data in the Elasticsearch index (ES) will only be created if the services [project-processor-es](https://github.com/topcoder-platform/project-processor-es) and [tc-bus-api](https://github.com/topcoder-platform/tc-bus-api) are also started locally. If you don't start them, the imported data won't be indexed in ES and will only be added to the DB. You may start them locally separately or, better, use `local/full/docker-compose.yml` as described above, which starts them automatically.
- **NOTE** During data importing a lot of records have to be indexed in ES, so you have to wait about 5-10 minutes after `npm run demo-data` finishes until the imported data is indexed in ES. You may watch the logs of `project-processor-es` to see whether it's done.
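
For example, to follow the indexing progress you can watch the processor logs from the **Full** docker-compose setup:

```bash
cd local/full
docker-compose logs -f project-processor-es
```
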
### Run Connect App with Project Service locally
To be able to run [Connect App](https://github.com/appirio-tech/connect-app) with the local setup of Project Service we have to do two things:
3. Restart both Connect App and Project Service if they were running.
## Test
```bash
npm run test
```
Tests are executed with the `NODE_ENV` environment variable set to `test`, and the `config/test.js` configuration is loaded.
Each of the individual modules/services is unit tested.
### JWT Authentication
Authentication is handled via the Authorization (Bearer) token header field. The token is a JWT token. Here is a sample token that is valid for a very long time for a user with an administrator role.
It's been signed with the secret 'secret'. This secret should match your entry in config/local.js. You can generate your own token using https://jwt.io
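
As an illustration, a request against the locally running service could pass the token like this. The `/v5/projects` path is an assumption here; check **swagger.yaml** for the actual routes:

```bash
# TOKEN holds your JWT (the sample admin token above, or one generated at https://jwt.io)
# NOTE: the /v5/projects path is illustrative - verify the real route in swagger.yaml
curl -H "Authorization: Bearer $TOKEN" http://localhost:8001/v5/projects
```
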
## Deploying with docker (might need updates)
**NOTE: This part of the README may contain inconsistencies and requires an update. Don't follow it unless you know how to properly configure these steps. It's not needed for the regular development process.**
You can paste **swagger.yaml** into the [swagger editor](http://editor.swagger.io/) or import **postman.json** and **postman_environment.json** to verify endpoints.
## Kafka commands
If you've used the **Full** `docker-compose` file `local/full/docker-compose.yml` during local setup to spawn Kafka & Zookeeper, you can use the following commands to manipulate Kafka topics and messages:
(Replace `TOPIC_NAME` with the name of the desired topic)
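
The commands in the subsections below run inside the Kafka container. As a starting point you can open a shell in it; the service name `kafka` is an assumption, so use the actual name from `local/full/docker-compose.yml`:

```bash
cd local/full
# "kafka" is assumed to be the Kafka service name in local/full/docker-compose.yml
docker-compose exec kafka bash
```
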