Commit 04d3faa

feat: improved README to be easier to follow during local setup
1 parent 3a2bcd8 commit 04d3faa

File tree

1 file changed: +121 −78 lines


README.md

Lines changed: 121 additions & 78 deletions
@@ -7,6 +7,122 @@

- ElasticSearch
- Docker, Docker Compose

## Local setup

1. Install node dependencies:

```bash
npm install
```

2. Run docker compose with dependent services:

```bash
cd local/
docker-compose up
```

<details><summary>Click to see details</summary>

This docker-compose file runs all the dependencies necessary for `project-processor-es` to work.

| Service | Name | Port |
|----------|:-----:|:----:|
| Elasticsearch | esearch | 9200 |
| Zookeeper | zookeeper | 2181 |
| Kafka | kafka | 9092 |

`docker-compose` automatically creates the Kafka topics used by `project-processor-es`, listed in `local/kafka-client/topics.txt`.
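For reference, `local/kafka-client/topics.txt` is a plain list of topic names, one per line. A hypothetical excerpt using the two topics mentioned elsewhere in this README (the actual file may list more):

```
project.action.create
project.action.update
```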

</details>

3. Set environment variables for M2M authentication: `AUTH0_CLIENT_ID`, `AUTH0_CLIENT_SECRET`, `AUTH0_URL`, `AUTH0_AUDIENCE`, `AUTH0_PROXY_SERVER_URL`:

```bash
export AUTH0_CLIENT_ID=<insert required value here>
export AUTH0_CLIENT_SECRET=<insert required value here>
export AUTH0_URL=<insert required value here>
export AUTH0_AUDIENCE=<insert required value here>
export AUTH0_PROXY_SERVER_URL=<insert required value here>
```

4. Initialize Elasticsearch indexes:

```bash
npm run sync:es
```

5. Start processor app:

```bash
npm start
```
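Before starting the app, it can help to verify that the M2M variables from step 3 are actually exported. A minimal POSIX-shell sketch; the `require_env` helper is hypothetical, not part of the repo:

```bash
# require_env NAME... -> reports any variable that is unset or empty,
# returning non-zero if at least one is missing
require_env() {
  missing=0
  for name in "$@"; do
    eval "value=\${$name:-}"
    if [ -z "$value" ]; then
      echo "missing: $name" >&2
      missing=1
    fi
  done
  return $missing
}

require_env AUTH0_CLIENT_ID AUTH0_CLIENT_SECRET AUTH0_URL AUTH0_AUDIENCE AUTH0_PROXY_SERVER_URL \
  && echo "all AUTH0 variables are set" \
  || echo "export the missing variables above before running: npm start"
```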

## Commands

### Lint & Tests commands

| Command | Description |
|----------|--------------|
| `npm run lint` | Run lint check. |
| `npm run lint:fix` | Run lint check with automatic fixing of errors and warnings where possible. |
| `npm run test` | Run integration tests. |
| `npm run test:cov` | Run integration tests with coverage report. |

### View data in Elasticsearch indexes

You may run the following command to output documents from the Elasticsearch indexes for debugging purposes.

```bash
npm run view-data <INDEX_NAME> <DOCUMENT_ID>
```

##### Examples

- `npm run view-data projects 1` views the document with id `1` in the `projects` index
- `npm run view-data timelines 1` views the document with id `1` in the `timelines` index
- `npm run view-data metadata 1` views the document with id `1` in the `metadata` index *(this index has only one document, and all the data is stored inside that one document, which might be very big)*.
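If you inspect the same indexes repeatedly, a tiny wrapper can save typing. This helper is purely hypothetical (not part of the repo) and only builds the command, defaulting the document id to `1` as in the examples above:

```bash
# view_data_cmd INDEX [ID] - build the view-data invocation; ID defaults to 1
view_data_cmd() {
  printf 'npm run view-data %s %s\n' "$1" "${2:-1}"
}

# eval "$(view_data_cmd projects)" would run: npm run view-data projects 1
view_data_cmd metadata
```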

### Kafka commands

If you've used `docker-compose` with the file `local/docker-compose.yml` during local setup to spawn Kafka & Zookeeper, you can use the following commands to manipulate Kafka topics and messages
(replace `TOPIC_NAME` with the name of the desired topic):

#### Create Topic

```bash
docker exec project-processor-es-kafka /opt/kafka/bin/kafka-topics.sh --create --zookeeper zookeeper:2181 --partitions 1 --replication-factor 1 --topic TOPIC_NAME
```

#### List Topics

```bash
docker exec project-processor-es-kafka /opt/kafka/bin/kafka-topics.sh --list --zookeeper zookeeper:2181
```

#### Watch Topic

```bash
docker exec project-processor-es-kafka /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic TOPIC_NAME
```

#### Post Message to Topic (from stdin)

```bash
docker exec -it project-processor-es-kafka /opt/kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic TOPIC_NAME
```

- Enter or copy/paste the message into the console after starting this command.

#### Post Message to Topic (from file)

```bash
docker exec -i project-processor-es-kafka /opt/kafka/bin/kafka-console-producer.sh --topic project.action.create --broker-list localhost:9092 < test_message.json
```

- All example messages are in `./test/data`.
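Because the console producer sends whatever bytes it is given, it can be worth checking that a message file parses as JSON before posting it. A hedged sketch using `node` (already required for this project); `test_message.json` follows the hypothetical file name above, and any file under `./test/data` works the same way:

```bash
# is_valid_json FILE - succeeds only if FILE parses as JSON
is_valid_json() {
  node -e "JSON.parse(require('fs').readFileSync(process.argv[1], 'utf8'))" "$1" 2>/dev/null
}

# check a message file before producing it to Kafka
msg=test_message.json
is_valid_json "$msg" || echo "error: $msg is missing or not valid JSON" >&2
```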

## Configuration

Configuration for the processor is at `config/default.js`.
@@ -40,40 +156,6 @@ Also note that there is a `/health` endpoint that checks for the health of the a

Config for tests are at `config/test.js`, it overrides some default config.
-
-### Local Deployment for Kafka
-
-* There exists an alternate `docker-compose.yml` file that can be used to spawn containers for the following services:
-
-| Service | Name | Port |
-|----------|:-----:|:----:|
-| ElasticSearch | esearch | 9200 |
-| Zookeeper | zookeeper | 2181 |
-| Kafka | kafka | 9092 |
-
-* To have kafka create a list of desired topics on startup, there exists a file with the path `local/kafka-client/topics.txt`. Each line from the file will be added as a topic.
-* To run these services simply run the following commands:
-
-```bash
-cd local
-docker-compose up -d
-```
-
-## Local deployment
-- Install dependencies `npm i`
-- Run code lint check `npm run lint`, running `npm run lint:fix` can fix some lint errors if any
-- Initialize Elasticsearch, create configured Elasticsearch index if not present: `npm run sync:es`
-- Start processor app `npm start`
-
-Note that you need to set AUTH0 related environment variables belows before you can start the processor.
-
-- AUTH0_URL
-- AUTH0_AUDIENCE
-- TOKEN_CACHE_TIME
-- AUTH0_CLIENT_ID
-- AUTH0_CLIENT_SECRET
-- AUTH0_PROXY_SERVER_URL
-
## Local Deployment with Docker

To run the Challenge ES Processor using docker, follow the below steps
@@ -119,10 +201,10 @@ npm run test:cov
```

## Verification
-- Call our project root directory : `our_project_root_directory`
-- Start Docker servicees, initialize Elasticsearch, start processor app
- Start Docker services, initialize Elasticsearch, start processor app
- Navigate to the repository root directory.
- Send message:
-`docker exec -i project-processor-es-kafka /opt/kafka/bin/kafka-console-producer.sh --topic project.action.create --broker-list localhost:9092 < our_project_root_directory/test/data/project/project.action.create.json`
`docker exec -i project-processor-es-kafka /opt/kafka/bin/kafka-console-producer.sh --topic project.action.create --broker-list localhost:9092 < ./test/data/project/project.action.create.json`
- run command `npm run view-data projects 1` to view the created data, you will see the data are properly created:

```bash
@@ -187,7 +269,7 @@ info: {

- Sent message to update data:

-`docker exec -i project-processor-es-kafka /opt/kafka/bin/kafka-console-producer.sh --topic project.action.update --broker-list localhost:9092 < our_project_root_directory/test/data/project/project.action.update.json`
`docker exec -i project-processor-es-kafka /opt/kafka/bin/kafka-console-producer.sh --topic project.action.update --broker-list localhost:9092 < ./test/data/project/project.action.update.json`
- Run command `npm run view-data projects 1` to view the updated data, you will see the data are properly updated:

```bash
@@ -242,7 +324,7 @@ info: {


- Run the producer and then write some invalid message into the console to send to the `project.action.create` topic:
-`docker exec -it project-processor-es-kafka /opt/kafka/bin/kafka-console-producer.sh --topic project.action.create`
`docker exec -it project-processor-es-kafka /opt/kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.create`
in the console, write message, one message per line:
`{ "topic": "project.action.update", "originator": "project-api", "timestamp": "2019-02-17T01:00:00", "mime-type": "application/json", "payload": { "id": "173803d3-019e-4033-b1cf-d7205c7f774c", "typeId": "123", "track": "Code", "name": "test3", "description": "desc3", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0dd", "groups": ["group2", "group3"], "updated": "2019-02-17T01:00:00", "updatedBy": "admin" } }`

@@ -254,42 +336,3 @@ info: {

- To test the health check API, run `export PORT=5000`, start the processor, then browse `http://localhost:5000/health` in a browser,
and you will see result `{"checksRun":1}`
-
-
-
-### Kafka Commands
-
-If you've used `docker-compose` with the file `local/docker-compose.yml` to spawn kafka & zookeeper, you can use the following commands to manipulate kafka topics and messages:
-(Replace TOPIC_NAME with the name of the desired topic)
-
-**Create Topic**
-
-```bash
-docker exec project-processor-es-kafka /opt/kafka/bin/kafka-topics.sh --create --zookeeper zookeeper:2181 --partitions 1 --replication-factor 1 --topic TOPIC_NAME
-```
-
-**List Topics**
-
-```bash
-docker exec project-processor-es-kafka /opt/kafka/bin/kafka-topics.sh --list --zookeeper zookeeper:2181
-```
-
-**Watch Topic**
-
-```bash
-docker exec project-processor-es-kafka /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --zookeeper zookeeper:2181 --topic TOPIC_NAME
-```
-
-**Post Message to Topic**
-
-```bash
-docker exec -it project-processor-es-kafka /opt/kafka/bin/kafka-console-producer.sh --topic TOPIC_NAME --broker-list localhost:9092
-```
-The message can be passed using `stdin`
-
-### Test
-```bash
-docker exec -i project-processor-es-kafka /opt/kafka/bin/kafka-console-producer.sh --topic project.action.create --broker-list localhost:9092 < test_message.json
-
-```
-All example for messages are in: our_project_root_directory/test/data
