Improve local setup #19

New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

Merged: 5 commits, Jan 17, 2020
182 changes: 124 additions & 58 deletions README.md
@@ -7,6 +7,122 @@
- ElasticSearch
- Docker, Docker Compose

## Local setup

1. Install node dependencies:

```bash
npm install
```

2. Run docker-compose with dependent services:

```bash
cd local/
docker-compose up
```

<details><summary>Click to see details</summary>

This docker-compose file runs all the dependencies necessary for `project-processor-es` to work.

| Service | Name | Port |
|----------|:-----:|:----:|
| Elasticsearch | esearch | 9200 |
| Zookeeper | zookeeper | 2181 |
| Kafka | kafka | 9092 |

`docker-compose` automatically creates the Kafka topics used by `project-processor-es`, as listed in `local/kafka-client/topics.txt`.

</details>


3. Set environment variables for M2M authentication: `AUTH0_CLIENT_ID`, `AUTH0_CLIENT_SECRET`, `AUTH0_URL`, `AUTH0_AUDIENCE`, `AUTH0_PROXY_SERVER_URL`:

```bash
export AUTH0_CLIENT_ID=<insert required value here>
export AUTH0_CLIENT_SECRET=<insert required value here>
export AUTH0_URL=<insert required value here>
export AUTH0_AUDIENCE=<insert required value here>
export AUTH0_PROXY_SERVER_URL=<insert required value here>
```

4. Initialize Elasticsearch indexes:

```bash
npm run sync:es
```

5. Start the processor app (a quick verification sketch follows this list):

```bash
npm start
```
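
Once all the steps above are done, you can quickly check that the local environment is healthy. A minimal sketch, assuming the default ports and container names from `local/docker-compose.yml`:

```bash
# Elasticsearch should answer with its cluster info on port 9200
curl http://localhost:9200

# the topics from local/kafka-client/topics.txt should appear in this list
docker exec project-processor-es-kafka /opt/kafka/bin/kafka-topics.sh --list --zookeeper zookeeper:2181
```

If both commands succeed, the processor should be able to reach its dependencies.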

## Commands

### Lint & Tests commands

| Command | Description |
|----------|--------------|
| `npm run lint` | Run lint check. |
| `npm run lint:fix` | Run lint check with automatic fixing of errors and warnings where possible. |
| `npm run test` | Run integration tests. |
| `npm run test:cov` | Run integration tests with coverage report. |

### View data in Elasticsearch indexes

You may run the following command to output documents from the Elasticsearch indexes for debugging purposes.

```bash
npm run view-data <INDEX_NAME> <DOCUMENT_ID>
```

##### Examples

- `npm run view-data projects 1` view document with id `1` in `projects` index
- `npm run view-data timelines 1` view document with id `1` in `timelines` index
- `npm run view-data metadata 1` view document with id `1` in `metadata` index *(this index has only one document, and all the data is stored inside that one document, which might be very big)*.

### Kafka commands

If you've used `docker-compose` with the file `local/docker-compose.yml` during local setup to spawn Kafka & Zookeeper, you can use the following commands to manipulate Kafka topics and messages (replace `TOPIC_NAME` with the name of the desired topic):

#### Create Topic

```bash
docker exec project-processor-es-kafka /opt/kafka/bin/kafka-topics.sh --create --zookeeper zookeeper:2181 --partitions 1 --replication-factor 1 --topic TOPIC_NAME
```

#### List Topics

```bash
docker exec project-processor-es-kafka /opt/kafka/bin/kafka-topics.sh --list --zookeeper zookeeper:2181
```

#### Watch Topic

```bash
docker exec project-processor-es-kafka /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic TOPIC_NAME
```

#### Post Message to Topic (from stdin)

```bash
docker exec -it project-processor-es-kafka /opt/kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic TOPIC_NAME
```

- Enter or copy/paste the message into the console after starting this command.

#### Post Message to Topic (from file)

```bash
docker exec -i project-processor-es-kafka /opt/kafka/bin/kafka-console-producer.sh --topic project.action.create --broker-list localhost:9092 < test_message.json
```

- All example messages are in `./test/data`.
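
For reference, every message in `./test/data` follows the same envelope: `topic`, `originator`, `timestamp`, `mime-type`, and a `payload` object. A minimal sketch of posting such a message with a heredoc (the payload below is abbreviated and illustrative only; a real payload carries many more fields, as shown in the Verification section):

```bash
docker exec -i project-processor-es-kafka /opt/kafka/bin/kafka-console-producer.sh \
  --broker-list localhost:9092 --topic project.action.create <<'EOF'
{"topic":"project.action.create","originator":"project-api","timestamp":"2019-06-20T13:43:25.817Z","mime-type":"application/json","payload":{"resource":"project","id":1,"name":"test project"}}
EOF
```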

## Configuration

Configuration for the processor is at `config/default.js`.
@@ -40,56 +156,6 @@ Also note that there is a `/health` endpoint that checks for the health of the a

Config for tests is at `config/test.js`; it overrides some of the default config.

## Local Kafka setup
- Refer to the extracted directory kafka_2.11-0.11.0.1 as `path_to_kafka`
- Refer to our project root directory as `our_project_root_directory`
- `http://kafka.apache.org/quickstart` contains details on setting up and managing a Kafka server; the steps below cover setup on Mac. On Windows, use the .bat commands in `bin/windows` instead.
- Download Kafka at `https://www.apache.org/dyn/closer.cgi?path=/kafka/1.1.0/kafka_2.11-1.1.0.tgz`
- Extract the downloaded tgz file
- Go to the extracted directory kafka_2.11-0.11.0.1
- Start ZooKeeper server:
`bin/zookeeper-server-start.sh config/zookeeper.properties`
- Use another terminal, go to same directory, start the Kafka server:
`bin/kafka-server-start.sh config/server.properties`
- Note that the zookeeper server is at localhost:2181, and Kafka server is at localhost:9092
- Use another terminal, go to same directory, create some topics:
`bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic project.action.create`
`bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic project.action.update`
`bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic project.action.delete`
- Verify that the topics are created:
  `bin/kafka-topics.sh --list --zookeeper localhost:2181`;
  it should list the created topics
- Run the producer and then write messages into the console to send to the `project.action.create` topic:
  `bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.create`
  In the console, write one message per line:
`{"topic":"project.action.create","originator":"project-api","timestamp":"2019-06-20T13:43:25.817Z","mime-type":"application/json","payload":{"resource":"project","createdAt":"2019-06-20T13:43:23.554Z","updatedAt":"2019-06-20T13:43:23.555Z","terms":[],"id":1,"name":"test project","description":"Hello I am a test project","type":"app","createdBy":40051333,"updatedBy":40051333,"projectEligibility":[],"bookmarks":[],"external":null,"status":"draft","lastActivityAt":"2019-06-20T13:43:23.514Z","lastActivityUserId":"40051333","members":[{"createdAt":"2019-06-20T13:43:23.555Z","updatedAt":"2019-06-20T13:43:23.625Z","id":2,"isPrimary":true,"role":"manager","userId":40051333,"updatedBy":40051333,"createdBy":40051333,"projectId":2,"deletedAt":null,"deletedBy":null}],"version":"v2","directProjectId":null,"billingAccountId":null,"estimatedPrice":null,"actualPrice":null,"details":null,"cancelReason":null,"templateId":null,"deletedBy":null,"attachments":null,"phases":null,"projectUrl":"https://connect.topcoder-dev.com/projects/2"}}`
- Optionally, in another terminal in the same directory, start a consumer to view the messages:
`bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic project.action.create --from-beginning`
- If Kafka doesn't accept a long message typed into the console, you can use this command to send a message from a file:
`path_to_kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.create < our_project_root_directory/test/data/project/project.action.create.json`
- Writing/reading messages to/from other topics is similar. All example messages are in:
  `our_project_root_directory/test/data`

## Local Elasticsearch setup

- In the `docker-es` folder, run `docker-compose up`

## Local deployment
- Install dependencies `npm i`
- Run lint check `npm run lint`; running `npm run lint:fix` can automatically fix some lint errors, if any
- Initialize Elasticsearch, create configured Elasticsearch index if not present: `npm run sync:es`
- Start processor app `npm start`

Note that you need to set the AUTH0-related environment variables below before you can start the processor.

- AUTH0_URL
- AUTH0_AUDIENCE
- TOKEN_CACHE_TIME
- AUTH0_CLIENT_ID
- AUTH0_CLIENT_SECRET
- AUTH0_PROXY_SERVER_URL

## Local Deployment with Docker

To run the processor using docker, follow the steps below
@@ -135,12 +201,10 @@ npm run test:cov
```

## Verification

- Call extracted directory kafka_2.11-0.11.0.1 : `path_to_kafka`
- Call our project root directory : `our_project_root_directory`
- Start kafka server, start elasticsearch, initialize Elasticsearch, start processor app
- Start Docker services, initialize Elasticsearch, start processor app
- Navigate to the repository root directory.
- Send message:
`path_to_kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.create < our_project_root_directory/test/data/project/project.action.create.json`
`docker exec -i project-processor-es-kafka /opt/kafka/bin/kafka-console-producer.sh --topic project.action.create --broker-list localhost:9092 < ./test/data/project/project.action.create.json`
- Run `npm run view-data projects 1` to view the created data; you will see that the data was properly created:

@@ -193,7 +257,8 @@ info: {


- Run the producer and then write an invalid message into the console to send to the `project.action.create` topic:
`bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.create`

`docker exec -it project-processor-es-kafka /opt/kafka/bin/kafka-console-producer.sh --topic project.action.create --broker-list localhost:9092`
In the console, write one message per line:
`{ "topic": "project.action.create", "originator": "project-api", "timestamp": "2019-02-16T00:00:00", "mime-type": "application/json", "payload": { "id": "invalid", "typeId": "8e17090c-465b-4c17-b6d9-dfa16300b0ff", "track": "Code", "name": "test", "description": "desc", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0aa", "phases": [{ "id": "8e17090c-465b-4c17-b6d9-dfa16300b012", "name": "review", "isActive": true, "duration": 10000 }], "prizeSets": [{ "type": "prize", "prizes": [{ "type": "winning prize", "value": 500 }] }], "reviewType": "code review", "tags": ["code"], "projectId": 123, "forumId": 456, "status": "Active", "created": "2019-02-16T00:00:00", "createdBy": "admin" } }`

@@ -203,7 +268,8 @@ info: {
- Then in the app console, you will see error messages

- Send a message to update data:
`path_to_kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.update < our_project_root_directory/test/data/project/project.action.update.json`

`docker exec -i project-processor-es-kafka /opt/kafka/bin/kafka-console-producer.sh --topic project.action.update --broker-list localhost:9092 < ./test/data/project/project.action.update.json`
- Run `npm run view-data projects 1` to view the updated data; you will see that the data was properly updated:

@@ -258,7 +324,7 @@ info: {


- Run the producer and then write an invalid message into the console to send to the `project.action.create` topic:
`bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.create`
`docker exec -it project-processor-es-kafka /opt/kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.create`
In the console, write one message per line:
`{ "topic": "project.action.update", "originator": "project-api", "timestamp": "2019-02-17T01:00:00", "mime-type": "application/json", "payload": { "id": "173803d3-019e-4033-b1cf-d7205c7f774c", "typeId": "123", "track": "Code", "name": "test3", "description": "desc3", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0dd", "groups": ["group2", "group3"], "updated": "2019-02-17T01:00:00", "updatedBy": "admin" } }`

6 changes: 0 additions & 6 deletions docker-es/docker-compose.yml

This file was deleted.

28 changes: 28 additions & 0 deletions local/docker-compose.yml
@@ -0,0 +1,28 @@
version: "2"
services:
esearch:
image: "elasticsearch:2.3"
ports:
- "9200:9200"
zookeeper:
image: wurstmeister/zookeeper
ports:
- "2181:2181"
environment:
zk_id: "1"
kafka:
image: wurstmeister/kafka
container_name: project-processor-es-kafka
depends_on:
- zookeeper
ports:
- "9092:9092"
environment:
KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:9092
KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
kafka-client:
build: ./kafka-client
depends_on:
- kafka
- zookeeper
5 changes: 5 additions & 0 deletions local/kafka-client/Dockerfile
@@ -0,0 +1,5 @@
```dockerfile
FROM wurstmeister/kafka
WORKDIR /app/
COPY topics.txt .
COPY create-topics.sh .
ENTRYPOINT ["/app/create-topics.sh"]
```
5 changes: 5 additions & 0 deletions local/kafka-client/create-topics.sh
@@ -0,0 +1,5 @@
```bash
#!/bin/bash

# create every Kafka topic listed in topics.txt
while read topic; do
  /opt/kafka/bin/kafka-topics.sh --create --zookeeper zookeeper:2181 --partitions 1 --replication-factor 1 --topic "$topic"
done < topics.txt
```
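
If the topics ever need to be re-created manually (for example, after wiping the Kafka container), the `kafka-client` service can be re-run on its own; a sketch, assuming the compose file above:

```bash
cd local/
# runs create-topics.sh once against the running cluster, then exits
docker-compose run --rm kafka-client
```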
3 changes: 3 additions & 0 deletions local/kafka-client/topics.txt
@@ -0,0 +1,3 @@
```
project.action.create
project.action.delete
project.action.update
```