This repository was archived by the owner on Mar 13, 2025. It is now read-only.

Commit 6541bc3: Merge pull request #68 from topcoder-platform/pgsql

Replace QLDB with PgSQL

2 parents: 927caa7 + 2bf338f


73 files changed: +4495 / -1896 lines

README.md

Lines changed: 33 additions & 142 deletions
````diff
@@ -2,26 +2,29 @@
 
 ## Install software
 
-- node 12.x
-- npm 6.x
+- node 12.x+
+- npm 6.x+
 - docker
-- elasticsearch 7.7
+- elasticsearch 7.7+
+- PostgreSQL
 
 ## Configuration
 
 Configuration for the application is at config/default.js and config/production.js. The following parameters can be set in config files or in env variables:
 
 - LOG_LEVEL: the log level
 - PORT: the server port
+- CASCADE_PAUSE_MS: how many milliseconds to pause between deleting records during a cascade delete (defaults to 1000)
 - AUTH_SECRET: TC Authentication secret
 - VALID_ISSUERS: valid issuers for TC authentication
 - PAGE_SIZE: the default pagination limit
 - MAX_PAGE_SIZE: the maximum pagination size
 - API_VERSION: the API version
-- AWS_ACCESS_KEY_ID: The AWS access key
-- AWS_SECRET_ACCESS_KEY: The AWS secret key
-- AWS_REGION: The Amazon region to use when connecting.
-- DATABASE: The QLDB ledger name
+- DB_NAME: the database name
+- DB_USERNAME: the database username
+- DB_PASSWORD: the database password
+- DB_HOST: the database host
+- DB_PORT: the database port
 - AUTH0_URL: Auth0 URL, used to get TC M2M token
 - AUTH0_AUDIENCE: Auth0 audience, used to get TC M2M token
 - TOKEN_CACHE_TIME: Auth0 token cache time, used to get TC M2M token
@@ -57,26 +60,38 @@ For `ES.DOCUMENTS` configuration, you will find multiple other configurations be
 
 Setup your Elasticsearch instance and ensure that it is up and running.
 
-1. Visit [this link](https://console.aws.amazon.com/qldb/home?region=us-east-1#gettingStarted), login and create one **ledger** database named `ubahn-db`
-2. Visit [this link](https://console.aws.amazon.com/iam/home?region=us-east-1#/security_credentials) to download your "Access keys"
-3. Follow the *Configuration* section to update config values, such as the database and the AWS key/secret
-4. Go to *UBahn-api*, run `npm i` and `npm run lint`
-5. Import mock data with `node scripts/db/genData.js`; this will create the tables and generate some test data (if you need it)
+1. Follow the *Configuration* section to update config values, such as the database settings
+2. Go to *UBahn-api*, run `npm i` and `npm run lint`
+3. Run the migrations: `npm run migrations up`. This creates the database and the tables.
+4. Then run `npm run insert-data` to insert mock data into the database.
+5. Then run `npm run migrate-db-to-es` to migrate the data from the database to Elasticsearch
 6. Start the server with `node app.js` or `npm run start`
 
-## Working with mock data
+## Migrations
 
-You can use the scripts `npm run insert-data` (and `npm run delete-data`) to insert mock data (and delete it, respectively). The data is inserted into QLDB and Elasticsearch. You need to set up the configuration and start the Elasticsearch instance before you run these scripts
+Migrations are located under the `./scripts/db/` folder. Run `npm run migrations up` and `npm run migrations down` to apply the migrations or roll back the earlier ones
+
+## Import data from QLDB
+
+Make sure `QLDB_NAME`, `AWS_REGION`, `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` are set in your environment
+
+Run `npm run migrate-qldb-to-pg` to import data from QLDB.
+
+## Import data from S3 to QLDB
+
+Make sure `BUCKET_NAME`, `QLDB_NAME`, `AWS_REGION`, `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` are set in your environment
+
+Run `npm run import-s3-data` to import data from S3 to QLDB.
 
 ## Local Deployment with Docker
 
-Make sure all config values are correct (AWS key and secret) and that the app runs locally, then run the commands below
+Make sure all config values are correct and that the app runs locally, then run the commands below
 
 1. Navigate to the directory `docker`
 
-2. Rename the file `sample.api.env` to `api.env`
+2. Rename the file `sample.env` to `.env`
 
-3. Set the required AUTH0 configurations, AWS credentials and Elasticsearch host in the file `api.env`
+3. Set the required AUTH0 configurations, DB configurations and Elasticsearch host in the file `.env`
 
 4. Once that is done, run the following command
 
@@ -86,128 +101,4 @@ Make sure all config values are right(aws key and secret), and you can run on lo
 
 5. When you are running the application for the first time, it will take some time initially to download the image and install the dependencies
 
-## API endpoints verification
-
-1. Open Postman
-2. Import *docs/UBahn_API.postman_collection.json* and *UBahn_ENV.postman_environment.json*, then check the endpoints
-
-## Test token
-
-You can use the tokens below to test roles and permissions
-
-### 01 Topcoder User
-
-- payload
-
-```json
-{
-  "roles": [
-    "Topcoder User"
-  ],
-  "iss": "https://api.topcoder.com",
-  "handle": "tc-user",
-  "exp": 1685571460,
-  "userId": "23166766",
-  "iat": 1585570860,
-  "email": "tc-user@gmail.com",
-  "jti": "0f1ef1d3-2b33-4900-bb43-48f2285f9627"
-}
-```
-
-- token
-
-```text
-eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJyb2xlcyI6WyJUb3Bjb2RlciBVc2VyIl0sImlzcyI6Imh0dHBzOi8vYXBpLnRvcGNvZGVyLmNvbSIsImhhbmRsZSI6InRjLXVzZXIiLCJleHAiOjE2ODU1NzE0NjAsInVzZXJJZCI6IjIzMTY2NzY2IiwiaWF0IjoxNTg1NTcwODYwLCJlbWFpbCI6InRjLXVzZXJAZ21haWwuY29tIiwianRpIjoiMGYxZWYxZDMtMmIzMy00OTAwLWJiNDMtNDhmMjI4NWY5NjI3In0.eBhXqSBe8zMRg2nBeGeZDgKiJdAYs0zOMzGfJCjWfcs
-```
-
-### 02 Copilot
-
-- payload
-
-```json
-{
-  "roles": [
-    "Topcoder User", "Copilot"
-  ],
-  "iss": "https://api.topcoder.com",
-  "handle": "tc-Copilot",
-  "exp": 1685571460,
-  "userId": "23166767",
-  "iat": 1585570860,
-  "email": "tc-Copilot@gmail.com",
-  "jti": "0f1ef1d3-2b33-4900-bb43-48f2285f9628"
-}
-```
-
-- token
-
-```text
-eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJyb2xlcyI6WyJUb3Bjb2RlciBVc2VyIiwiQ29waWxvdCJdLCJpc3MiOiJodHRwczovL2FwaS50b3Bjb2Rlci5jb20iLCJoYW5kbGUiOiJ0Yy1Db3BpbG90IiwiZXhwIjoxNjg1NTcxNDYwLCJ1c2VySWQiOiIyMzE2Njc2NyIsImlhdCI6MTU4NTU3MDg2MCwiZW1haWwiOiJ0Yy1Db3BpbG90QGdtYWlsLmNvbSIsImp0aSI6IjBmMWVmMWQzLTJiMzMtNDkwMC1iYjQzLTQ4ZjIyODVmOTYyOCJ9.gP5JqJGCnOjO_gYs2r3-AQt5x8YIym15m3t43603cgc
-```
-
-### 03 Admin
-
-- payload
-
-```json
-{
-  "roles": [
-    "Topcoder User", "Copilot", "Admin"
-  ],
-  "iss": "https://api.topcoder.com",
-  "handle": "tc-Admin",
-  "exp": 1685571460,
-  "userId": "23166768",
-  "iat": 1585570860,
-  "email": "tc-Admin@gmail.com",
-  "jti": "0f1ef1d3-2b33-4900-bb43-48f2285f9630"
-}
-```
-
-- token
-
-```text
-eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJyb2xlcyI6WyJUb3Bjb2RlciBVc2VyIiwiQ29waWxvdCIsIkFkbWluIl0sImlzcyI6Imh0dHBzOi8vYXBpLnRvcGNvZGVyLmNvbSIsImhhbmRsZSI6InRjLUFkbWluIiwiZXhwIjoxNjg1NTcxNDYwLCJ1c2VySWQiOiIyMzE2Njc2OCIsImlhdCI6MTU4NTU3MDg2MCwiZW1haWwiOiJ0Yy1BZG1pbkBnbWFpbC5jb20iLCJqdGkiOiIwZjFlZjFkMy0yYjMzLTQ5MDAtYmI0My00OGYyMjg1Zjk2MzAifQ.eR97kePT0Gu-t7vUE0Ed8A88Dnmtgebyml2jrRyxhOk
-```
-
-### M2M token 01
-
-- payload; this token is missing `all:usersSkill`, so all endpoints in the usersSkill group will return 403
-
-```json
-{
-  "scopes": "all:user all:role all:skill all:usersRole all:organization all:skillsProvider",
-  "iss": "https://api.topcoder.com",
-  "handle": "tc-mm-01",
-  "exp": 1685571460,
-  "iat": 1585570860,
-  "jti": "0f1ef1d3-2b33-4900-bb43-48f2285f9630"
-}
-```
-
-- token
-
-```text
-eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJzY29wZXMiOiJhbGw6dXNlciBhbGw6cm9sZSBhbGw6c2tpbGwgYWxsOnVzZXJzUm9sZSBhbGw6b3JnYW5pemF0aW9uIGFsbDpza2lsbHNQcm92aWRlciIsImlzcyI6Imh0dHBzOi8vYXBpLnRvcGNvZGVyLmNvbSIsImhhbmRsZSI6InRjLW1tLTAxIiwiZXhwIjoxNjg1NTcxNDYwLCJpYXQiOjE1ODU1NzA4NjAsImp0aSI6IjBmMWVmMWQzLTJiMzMtNDkwMC1iYjQzLTQ4ZjIyODVmOTYzMCJ9.BlDIYsCTcHTib9XhpyzpO-KkMTTMy0egq_7qlLWRmoM
-```
-
-### M2M token 02
-
-- payload; this token contains all the scopes, so it can request all endpoints
-
-```json
-{
-  "scopes": "all:user all:role all:skill all:usersRole all:organization all:skillsProvider all:usersSkill all:externalProfile all:achievementsProvider all:achievement all:attributeGroup all:attribute all:userAttribute",
-  "iss": "https://api.topcoder.com",
-  "handle": "tc-mm-02",
-  "exp": 1685571460,
-  "iat": 1585570860,
-  "jti": "0f1ef1d3-2b33-4900-bb43-48f2285f9630"
-}
-```
-
-- token
-
-```text
-eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJzY29wZXMiOiJhbGw6dXNlciBhbGw6cm9sZSBhbGw6c2tpbGwgYWxsOnVzZXJzUm9sZSBhbGw6b3JnYW5pemF0aW9uIGFsbDpza2lsbHNQcm92aWRlciBhbGw6dXNlcnNTa2lsbCBhbGw6ZXh0ZXJuYWxQcm9maWxlIGFsbDphY2hpZXZlbWVudHNQcm92aWRlciBhbGw6YWNoaWV2ZW1lbnQgYWxsOmF0dHJpYnV0ZUdyb3VwIGFsbDphdHRyaWJ1dGUgYWxsOnVzZXJBdHRyaWJ1dGUiLCJpc3MiOiJodHRwczovL2FwaS50b3Bjb2Rlci5jb20iLCJoYW5kbGUiOiJ0Yy1tbS0wMiIsImV4cCI6MTY4NTU3MTQ2MCwiaWF0IjoxNTg1NTcwODYwLCJqdGkiOiIwZjFlZjFkMy0yYjMzLTQ5MDAtYmI0My00OGYyMjg1Zjk2MzAifQ.8XJahLdv9mkgkL7EsOwsf8uKg4J9u-1UM73pvZ9n3JY
-```
+You can also head into the `docker-pgsql-es` folder and run `docker-compose up -d` to get docker instances of pgsql and elasticsearch to use with the api
````
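The README diff above introduces a `CASCADE_PAUSE_MS` setting that throttles cascade deletes. A minimal shell sketch of the idea it describes (illustrative only: the API implements this in Node, and the record ids here are made up; the demo uses a small pause instead of the 1000 ms default):

```shell
# Pause CASCADE_PAUSE_MS milliseconds between deletions, mirroring the
# throttling the config option controls. 50 ms here keeps the demo fast.
CASCADE_PAUSE_MS=${CASCADE_PAUSE_MS:-50}
for record in user-1 user-2 user-3; do   # hypothetical record ids
  echo "deleted $record"
  # convert milliseconds to the fractional seconds sleep expects
  sleep "$(awk "BEGIN { print $CASCADE_PAUSE_MS / 1000 }")"
done
```

A larger pause spreads the delete load on PostgreSQL and Elasticsearch at the cost of a slower overall cascade.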

VERIFICATION.md

Lines changed: 0 additions & 102 deletions
This file was deleted.

config/default.js

Lines changed: 8 additions & 5 deletions
```diff
@@ -4,7 +4,9 @@
 
 module.exports = {
   LOG_LEVEL: process.env.LOG_LEVEL || 'debug',
-  PORT: process.env.PORT || 3001,
+  PORT: process.env.PORT || 3002,
+
+  CASCADE_PAUSE_MS: process.env.CASCADE_PAUSE_MS || 1000,
 
   AUTH_SECRET: process.env.AUTH_SECRET || 'CLIENT_SECRET',
   VALID_ISSUERS: process.env.VALID_ISSUERS ? process.env.VALID_ISSUERS.replace(/\\"/g, '')
@@ -14,10 +16,11 @@ module.exports = {
   MAX_PAGE_SIZE: parseInt(process.env.MAX_PAGE_SIZE) || 100,
   API_VERSION: process.env.API_VERSION || 'api/1.0',
 
-  AWS_ACCESS_KEY_ID: process.env.AWS_ACCESS_KEY_ID,
-  AWS_SECRET_ACCESS_KEY: process.env.AWS_SECRET_ACCESS_KEY,
-  AWS_REGION: process.env.AWS_REGION || 'us-east-1',
-  DATABASE: process.env.DATABASE || 'ubahn-db',
+  DB_NAME: process.env.DB_NAME || 'ubahn-db',
+  DB_USERNAME: process.env.DB_USER || 'postgres',
+  DB_PASSWORD: process.env.DB_PASSWORD || 'password',
+  DB_HOST: process.env.DB_HOST || 'localhost',
+  DB_PORT: process.env.DB_PORT || 5432,
 
   AUTH0_URL: process.env.AUTH0_URL,
   AUTH0_AUDIENCE: process.env.AUTH0_AUDIENCE,
```
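The AWS/QLDB settings are replaced by five `DB_*` values, each overridable from the environment. A quick sanity check of how the pieces combine (the API assembles its connection internally; the URL echoed below is only for illustration):

```shell
# Override the new PgSQL defaults from config/default.js via the environment.
export DB_NAME=ubahn-db
export DB_USERNAME=postgres   # note: the config reads process.env.DB_USER for this value
export DB_PASSWORD=password
export DB_HOST=localhost
export DB_PORT=5432
# Illustrative connection URL built from the same values:
echo "postgres://${DB_USERNAME}@${DB_HOST}:${DB_PORT}/${DB_NAME}"
```

One quirk visible in the diff: the config key is `DB_USERNAME` but it is populated from the `DB_USER` environment variable, so exporting `DB_USERNAME` alone will not change the username the API uses.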

docker-pgsql-es/docker-compose.yml

Lines changed: 21 additions & 0 deletions
```diff
@@ -0,0 +1,21 @@
+version: '3'
+services:
+  postgres:
+    image: "postgres:12.4"
+    volumes:
+      - database-data:/var/lib/postgresql/data/
+    ports:
+      - "5432:5432"
+    environment:
+      POSTGRES_PASSWORD: ${DB_PASSWORD}
+      POSTGRES_USER: ${DB_USERNAME}
+      POSTGRES_DB: ${DB_NAME}
+  esearch:
+    image: elasticsearch:7.7.1
+    container_name: ubahn-data-processor-es_es
+    ports:
+      - "9200:9200"
+    environment:
+      - discovery.type=single-node
+volumes:
+  database-data:
```

docker-pgsql-es/sample.env

Lines changed: 5 additions & 0 deletions
```diff
@@ -0,0 +1,5 @@
+DB_NAME=ubahn-db
+DB_USERNAME=postgres
+DB_PASSWORD=<password>
+DB_HOST=postgres
+DB_PORT=5432
```
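docker-compose resolves interpolations like `${DB_PASSWORD}` in the compose file from the shell environment or from an `.env` file next to it. A minimal sketch of creating such a file from the sample values and loading it into the current shell (the password is a placeholder; `DB_HOST=postgres` is the compose service name, reachable only from inside the compose network):

```shell
# Create an .env file mirroring sample.env (placeholder password).
cat > .env <<'EOF'
DB_NAME=ubahn-db
DB_USERNAME=postgres
DB_PASSWORD=changeme
DB_HOST=postgres
DB_PORT=5432
EOF
# Export every assignment so child processes (e.g. docker-compose) see them.
set -a
. ./.env
set +a
echo "$DB_HOST:$DB_PORT"
```

With the file in place, `docker-compose up -d` in `docker-pgsql-es` starts PostgreSQL and Elasticsearch with these credentials.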

docker/Dockerfile

Lines changed: 5 additions & 0 deletions
```diff
@@ -9,4 +9,9 @@ WORKDIR /ubahn_api
 
 # Install the dependencies from package.json
 RUN npm install
+
+# Expose port
+EXPOSE 3002
+
+# start api
 CMD npm start
```

docker/sample.api.env renamed to docker/sample.env

Lines changed: 6 additions & 2 deletions
```diff
@@ -1,5 +1,9 @@
-AWS_ACCESS_KEY_ID=<AWS Access Key ID>
-AWS_SECRET_ACCESS_KEY=<AWS Secret Access Key>
+DB_NAME=ubahn-db
+DB_USERNAME=postgres
+DB_PASSWORD=<password>
+DB_HOST=postgres
+DB_PORT=5432
 ES_HOST=<ES Host Endpoint>
 
 AUTH0_URL=<AUTH0 URL>
```

0 commit comments