Commit 74a2513

Improve local setup
1 parent a3a1ab3 commit 74a2513

6 files changed: +103 -45 lines

README.md

Lines changed: 62 additions & 39 deletions
@@ -40,40 +40,24 @@ Also note that there is a `/health` endpoint that checks for the health of the a
 
 Config for tests are at `config/test.js`, it overrides some default config.
 
-## Local Kafka setup
-- Call the extracted directory kafka_2.11-0.11.0.1: `path_to_kafka`
-- Call our project root directory: `our_project_root_directory`
-- `http://kafka.apache.org/quickstart` contains details on setting up and managing a Kafka server;
-the steps below set up a Kafka server on Mac. On Windows, use the bat commands in bin/windows instead.
-- Download Kafka from `https://www.apache.org/dyn/closer.cgi?path=/kafka/1.1.0/kafka_2.11-1.1.0.tgz`
-- Extract the downloaded tgz file
-- Go to the extracted directory kafka_2.11-0.11.0.1
-- Start the ZooKeeper server:
-`bin/zookeeper-server-start.sh config/zookeeper.properties`
-- In another terminal, go to the same directory and start the Kafka server:
-`bin/kafka-server-start.sh config/server.properties`
-- Note that the ZooKeeper server is at localhost:2181 and the Kafka server is at localhost:9092
-- In another terminal, go to the same directory and create some topics:
-`bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic project.action.create`
-`bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic project.action.update`
-`bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic project.action.delete`
-- Verify that the topics were created:
-`bin/kafka-topics.sh --list --zookeeper localhost:2181`,
-it should list the created topics
-- Run the producer and then write a message into the console to send to the `project.action.create` topic:
-`bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.create`
-In the console, write messages, one message per line:
-`{"topic":"project.action.create","originator":"project-api","timestamp":"2019-06-20T13:43:25.817Z","mime-type":"application/json","payload":{"resource":"project","createdAt":"2019-06-20T13:43:23.554Z","updatedAt":"2019-06-20T13:43:23.555Z","terms":[],"id":1,"name":"test project","description":"Hello I am a test project","type":"app","createdBy":40051333,"updatedBy":40051333,"projectEligibility":[],"bookmarks":[],"external":null,"status":"draft","lastActivityAt":"2019-06-20T13:43:23.514Z","lastActivityUserId":"40051333","members":[{"createdAt":"2019-06-20T13:43:23.555Z","updatedAt":"2019-06-20T13:43:23.625Z","id":2,"isPrimary":true,"role":"manager","userId":40051333,"updatedBy":40051333,"createdBy":40051333,"projectId":2,"deletedAt":null,"deletedBy":null}],"version":"v2","directProjectId":null,"billingAccountId":null,"estimatedPrice":null,"actualPrice":null,"details":null,"cancelReason":null,"templateId":null,"deletedBy":null,"attachments":null,"phases":null,"projectUrl":"https://connect.topcoder-dev.com/projects/2"}}`
-- Optionally, in another terminal in the same directory, start a consumer to view the messages:
-`bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic project.action.create --from-beginning`
-- If Kafka does not allow you to input a long message, you can use this script to write the message from a file:
-`path_to_kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.create < our_project_root_directory/test/data/project/project.action.create.json`
-- Writing/reading messages to/from other topics is similar. All example messages are in:
-`our_project_root_directory/test/data`
 
-## Local Elasticsearch setup
+### Local Deployment for Kafka
+
+* There is an alternate `docker-compose.yml` file that can be used to spawn containers for the following services:
+
+| Service | Name | Port |
+|----------|:-----:|:----:|
+| Elasticsearch | esearch | 9200 |
+| Zookeeper | zookeeper | 2181 |
+| Kafka | kafka | 9092 |
 
-- In the `docker-es` folder, run `docker-compose up`
+* To have Kafka create the desired topics on startup, there is a file at `local/kafka-client/topics.txt`. Each line of that file is added as a topic.
+* To run these services, simply run the following commands:
+
+```bash
+cd local
+docker-compose up -d
+```
 
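Once `docker-compose up -d` has run, a quick sanity check can confirm the stack is healthy. This is a sketch rather than part of the commit, assuming the default ports and the container names from the table above:

```bash
# Run from the local/ directory used above
docker-compose ps            # esearch, zookeeper and kafka should be "Up"; kafka-client runs once to create topics and exits
curl http://localhost:9200   # Elasticsearch should answer with its JSON version banner
```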

 ## Local deployment
 - Install dependencies `npm i`
@@ -135,12 +119,10 @@ npm run test:cov
 ```
 
 ## Verification
-
-- Call the extracted directory kafka_2.11-0.11.0.1: `path_to_kafka`
 - Call our project root directory: `our_project_root_directory`
-- Start the Kafka server, start Elasticsearch, initialize Elasticsearch, start the processor app
+- Start the Docker services, initialize Elasticsearch, start the processor app
 - Send message:
-`path_to_kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.create < our_project_root_directory/test/data/project/project.action.create.json`
+`docker exec -i tc-projects-kafka /opt/kafka/bin/kafka-console-producer.sh --topic project.action.create --broker-list localhost:9092 < our_project_root_directory/test/data/project/project.action.create.json`
 - Run command `npm run view-data projects 1` to view the created data; you will see that the data was properly created:
 
 ```bash
@@ -193,7 +175,8 @@ info: {
 
 
 - Run the producer and then write an invalid message into the console to send to the `project.action.create` topic:
-`bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.create`
+
+`docker exec -it tc-projects-kafka /opt/kafka/bin/kafka-console-producer.sh --topic project.action.create --broker-list localhost:9092`
 In the console, write the message, one message per line:
 `{ "topic": "project.action.create", "originator": "project-api", "timestamp": "2019-02-16T00:00:00", "mime-type": "application/json", "payload": { "id": "invalid", "typeId": "8e17090c-465b-4c17-b6d9-dfa16300b0ff", "track": "Code", "name": "test", "description": "desc", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0aa", "phases": [{ "id": "8e17090c-465b-4c17-b6d9-dfa16300b012", "name": "review", "isActive": true, "duration": 10000 }], "prizeSets": [{ "type": "prize", "prizes": [{ "type": "winning prize", "value": 500 }] }], "reviewType": "code review", "tags": ["code"], "projectId": 123, "forumId": 456, "status": "Active", "created": "2019-02-16T00:00:00", "createdBy": "admin" } }`
 
@@ -203,7 +186,8 @@ info: {
 - Then in the app console, you will see error messages
 
 - Send a message to update the data:
-`path_to_kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.update < our_project_root_directory/test/data/project/project.action.update.json`
+
+`docker exec -i tc-projects-kafka /opt/kafka/bin/kafka-console-producer.sh --topic project.action.update --broker-list localhost:9092 < our_project_root_directory/test/data/project/project.action.update.json`
 - Run command `npm run view-data projects 1` to view the updated data; you will see that the data was properly updated:
 
 ```bash
@@ -258,7 +242,7 @@ info: {
 
 
 - Run the producer and then write an invalid message into the console to send to the `project.action.create` topic:
-`bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.create`
+`docker exec -it tc-projects-kafka /opt/kafka/bin/kafka-console-producer.sh --topic project.action.create --broker-list localhost:9092`
 In the console, write the message, one message per line:
 `{ "topic": "project.action.update", "originator": "project-api", "timestamp": "2019-02-17T01:00:00", "mime-type": "application/json", "payload": { "id": "173803d3-019e-4033-b1cf-d7205c7f774c", "typeId": "123", "track": "Code", "name": "test3", "description": "desc3", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0dd", "groups": ["group2", "group3"], "updated": "2019-02-17T01:00:00", "updatedBy": "admin" } }`

@@ -270,3 +254,42 @@ info: {
 
 - To test the health check API, run `export PORT=5000`, start the processor, then browse `http://localhost:5000/health` in a browser,
 and you will see the result `{"checksRun":1}`
+
+### Kafka Commands
+
+If you've used `docker-compose` with the file `local/docker-compose.yml` to spawn Kafka and Zookeeper, you can use the following commands to manipulate Kafka topics and messages
+(replace TOPIC_NAME with the name of the desired topic):
+
+**Create Topic**
+
+```bash
+docker exec tc-projects-kafka /opt/kafka/bin/kafka-topics.sh --create --zookeeper zookeeper:2181 --partitions 1 --replication-factor 1 --topic TOPIC_NAME
+```
+
+**List Topics**
+
+```bash
+docker exec tc-projects-kafka /opt/kafka/bin/kafka-topics.sh --list --zookeeper zookeeper:2181
+```
+
+**Watch Topic**
+
+```bash
+docker exec tc-projects-kafka /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic TOPIC_NAME
+```
+
+**Post Message to Topic**
+
+```bash
+docker exec -it tc-projects-kafka /opt/kafka/bin/kafka-console-producer.sh --topic TOPIC_NAME --broker-list localhost:9092
+```
+
+The message can be passed using `stdin`.
+
+### Test
+
+```bash
+docker exec -i tc-projects-kafka /opt/kafka/bin/kafka-console-producer.sh --topic project.action.create --broker-list localhost:9092 < test_message.json
+```
+
+All example messages are in: `our_project_root_directory/test/data`
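For bulk testing, the example payloads can be sent in one pass. The loop below is a sketch, assuming each example file under `test/data` is named after the topic it targets (as the `project.action.*.json` files are) and that the Docker stack above is running:

```bash
# Send every example message under test/data to the topic named in its filename.
for f in our_project_root_directory/test/data/*/*.json; do
  topic="$(basename "$f" .json)"
  echo "Sending $f to topic $topic"
  docker exec -i tc-projects-kafka /opt/kafka/bin/kafka-console-producer.sh \
    --topic "$topic" --broker-list localhost:9092 < "$f"
done
```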

docker-es/docker-compose.yml

Lines changed: 0 additions & 6 deletions
This file was deleted.

local/docker-compose.yml

Lines changed: 28 additions & 0 deletions
@@ -0,0 +1,28 @@
+version: "2"
+services:
+  esearch:
+    image: "elasticsearch:2.3"
+    ports:
+      - "9200:9200"
+  zookeeper:
+    image: confluent/zookeeper
+    ports:
+      - "2181:2181"
+    environment:
+      zk_id: "1"
+  kafka:
+    image: wurstmeister/kafka
+    container_name: tc-projects-kafka
+    depends_on:
+      - zookeeper
+    ports:
+      - "9092:9092"
+    environment:
+      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
+      KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:9092
+      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
+  kafka-client:
+    build: ./kafka-client
+    depends_on:
+      - kafka
+      - zookeeper
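The `kafka-client` service defined above is a one-shot container: it builds from `local/kafka-client`, creates the topics listed in `topics.txt`, and exits. A quick way to confirm it ran, using the names defined in this file (a sketch, not part of the commit):

```bash
cd local
# The topic-creation container should have run once and exited; its log shows each create call
docker-compose logs kafka-client
# Describe one of the expected topics inside the broker container
docker exec tc-projects-kafka /opt/kafka/bin/kafka-topics.sh --describe --zookeeper zookeeper:2181 --topic project.action.create
```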

local/kafka-client/Dockerfile

Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+FROM confluent/kafka
+WORKDIR /app/
+COPY topics.txt .
+COPY create-topics.sh .
+ENTRYPOINT ["/app/create-topics.sh"]

local/kafka-client/create-topics.sh

Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+#!/bin/bash
+
+while read topic; do
+  /usr/bin/kafka-topics --create --zookeeper zookeeper:2181 --partitions 1 --replication-factor 1 --topic $topic
+done < topics.txt
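The script above assumes Zookeeper and Kafka are already reachable when the container starts, which `depends_on` alone does not guarantee. A more defensive variant could retry, as in the sketch below (an illustration only, assuming the same confluent/kafka image and its `/usr/bin/kafka-topics` tool):

```bash
#!/bin/bash
# Sketch: retry each topic creation a few times so a slow-starting broker
# or Zookeeper does not make the one-shot kafka-client container fail outright.
while read -r topic; do
  for attempt in 1 2 3 4 5; do
    if /usr/bin/kafka-topics --create --zookeeper zookeeper:2181 \
         --partitions 1 --replication-factor 1 --topic "$topic"; then
      break
    fi
    echo "Creating '$topic' failed (attempt $attempt), retrying in 5s..." >&2
    sleep 5
  done
done < topics.txt
```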

local/kafka-client/topics.txt

Lines changed: 3 additions & 0 deletions
@@ -0,0 +1,3 @@
+project.action.create
+project.action.delete
+project.action.update
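If the processor later needs another topic, one possible workflow (an assumption, not spelled out in this commit) is to append it to this file and rebuild the one-shot client, since the Dockerfile copies `topics.txt` into the image:

```bash
cd local
# "project.action.archive" is a hypothetical example topic name
echo "project.action.archive" >> kafka-client/topics.txt
# Rebuild and re-run only the kafka-client service so it picks up the new file
docker-compose up --build kafka-client
```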
