This repository was archived by the owner on Mar 13, 2025. It is now read-only.

Commit ebdd993

merge winning submission of qldb to pgsql contest
1 parent c1ef242 commit ebdd993


53 files changed: +6912, -1362 lines

README.md

Lines changed: 20 additions & 141 deletions
@@ -2,26 +2,29 @@
 
 ## Install software
 
-- node 12.x
-- npm 6.x
+- node 12.x+
+- npm 6.x+
 - docker
-- elasticsearch 7.7
+- elasticsearch 7.7+
+- PostgreSQL
 
 ## Configuration
 
 Configuration for the application is at config/default.js and config/production.js. The following parameters can be set in config files or in env variables:
 
 - LOG_LEVEL: the log level
 - PORT: the server port
+- CASCADE_PAUSE_MS: how many milliseconds to pause between deleting records during a cascade delete (defaults to 1000)
 - AUTH_SECRET: TC Authentication secret
 - VALID_ISSUERS: valid issuers for TC authentication
 - PAGE_SIZE: the default pagination limit
 - MAX_PAGE_SIZE: the maximum pagination size
 - API_VERSION: the API version
-- AWS_ACCESS_KEY_ID: The AWS access key
-- AWS_SECRET_ACCESS_KEY: The AWS secret key
-- AWS_REGION: The Amazon region to use when connecting.
-- DATABASE: The QLDB ledger name
+- DB_NAME: the database name
+- DB_USERNAME: the database username
+- DB_PASSWORD: the database password
+- DB_HOST: the database host
+- DB_PORT: the database port
 - AUTH0_URL: Auth0 URL, used to get TC M2M token
 - AUTH0_AUDIENCE: Auth0 audience, used to get TC M2M token
 - TOKEN_CACHE_TIME: Auth0 token cache time, used to get TC M2M token
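
As an illustration of how the new DB_* values replace the old QLDB settings, here is a minimal sketch of building a Postgres connection from them. It assumes the `pg` client and the `config` package (which resolves config/default.js); the data-access layer actually used by the submission is not part of this hunk, so treat the client choice and file as hypothetical.

```js
// Hypothetical sketch — not part of this commit
const config = require('config') // resolves config/default.js / config/production.js
const { Pool } = require('pg')   // assumption: a plain node-postgres pool

// Map the DB_* configuration values onto a connection pool
const pool = new Pool({
  host: config.get('DB_HOST'),
  port: config.get('DB_PORT'),
  user: config.get('DB_USERNAME'),
  password: config.get('DB_PASSWORD'),
  database: config.get('DB_NAME')
})

// Simple connectivity check
pool.query('SELECT 1')
  .then(() => console.log('Postgres connection OK'))
  .catch((err) => console.error('Postgres connection failed', err))
```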
@@ -57,26 +60,24 @@ For `ES.DOCUMENTS` configuration, you will find multiple other configurations be
 
 Setup your Elasticsearch instance and ensure that it is up and running.
 
-1. Visit [this link](https://console.aws.amazon.com/qldb/home?region=us-east-1#gettingStarted), login and create one **ledger** databases named `ubahn-db`
-2. Visit [this link](https://console.aws.amazon.com/iam/home?region=us-east-1#/security_credentials) to download your "Access keys"
-3. Follow *Configuration* section to update config values, like database, aws key/secret etc ..
-4. Goto *UBahn-api*, run `npm i` and `npm run lint`
-5. Import mock data, `node scripts/db/genData.js`, this will create tables and gen some data for test (if you need this)
-6. Startup server `node app.js` or `npm run start`
+1. Follow the *Configuration* section to update config values, like the database, etc.
+2. Go to *UBahn-api*, run `npm i` and `npm run lint`
+3. Import mock data with `node scripts/db/genData.js`; this will generate some test data (if you need it)
+4. Start the server with `node app.js` or `npm run start`
 
 ## Working with mock data
 
-You can use the scripts `npm run insert-data` (and `npm run delete-data`) to insert mock data (and delete mock data respectively). The data is inserted into QLDB and Elasticsearch. You need to setup the configurations beforehand and also start the elasticsearch instance before you run these scripts
+You can use the scripts `npm run insert-data` (and `npm run delete-data`) to insert mock data (and delete mock data, respectively). The data is inserted into Postgres and Elasticsearch. You need to set up the configuration beforehand and start the Elasticsearch instance before you run these scripts. You can run the script `npm run migrate-db-to-es` to dump the data in the database into ES.
 
 ## Local Deployment with Docker
 
-Make sure all config values are right(aws key and secret), and you can run on local successful, then run below commands
+Make sure all config values are correct and that you can run locally successfully, then run the commands below
 
 1. Navigate to the directory `docker`
 
-2. Rename the file `sample.api.env` to `api.env`
+2. Rename the file `sample.env` to `.env`
 
-3. Set the required AUTH0 configurations, AWS credentials and ElasticSearch host in the file `api.env`
+3. Set the required AUTH0 configurations, DB configurations and ElasticSearch host in the file `.env`
@@ -86,128 +87,6 @@ Make sure all config values are right(aws key and secret), and you can run on lo
 
 5. When you are running the application for the first time, It will take some time initially to download the image and install the dependencies
 
-## API endpoints verification
-
-1. open postman
-2. import *docs/UBahn_API.postman_collection.json* , *UBahn_ENV.postman_environment.json* and then check endpoints
-
-## Test token
-
-you can use below token to test role and permissions
+## Verification
 
-### 01 Topcoder User
-
-- payload
-
-```json
-{
-  "roles": [
-    "Topcoder User"
-  ],
-  "iss": "https://api.topcoder.com",
-  "handle": "tc-user",
-  "exp": 1685571460,
-  "userId": "23166766",
-  "iat": 1585570860,
-  "email": "tc-user@gmail.com",
-  "jti": "0f1ef1d3-2b33-4900-bb43-48f2285f9627"
-}
-```
-
-- token
-
-```text
-eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJyb2xlcyI6WyJUb3Bjb2RlciBVc2VyIl0sImlzcyI6Imh0dHBzOi8vYXBpLnRvcGNvZGVyLmNvbSIsImhhbmRsZSI6InRjLXVzZXIiLCJleHAiOjE2ODU1NzE0NjAsInVzZXJJZCI6IjIzMTY2NzY2IiwiaWF0IjoxNTg1NTcwODYwLCJlbWFpbCI6InRjLXVzZXJAZ21haWwuY29tIiwianRpIjoiMGYxZWYxZDMtMmIzMy00OTAwLWJiNDMtNDhmMjI4NWY5NjI3In0.eBhXqSBe8zMRg2nBeGeZDgKiJdAYs0zOMzGfJCjWfcs
-```
-
-#### 02 Copilot
-
-- payload
-
-```json
-{
-  "roles": [
-    "Topcoder User","Copilot"
-  ],
-  "iss": "https://api.topcoder.com",
-  "handle": "tc-Copilot",
-  "exp": 1685571460,
-  "userId": "23166767",
-  "iat": 1585570860,
-  "email": "tc-Copilot@gmail.com",
-  "jti": "0f1ef1d3-2b33-4900-bb43-48f2285f9628"
-}
-```
-
-- token
-
-```json
-eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJyb2xlcyI6WyJUb3Bjb2RlciBVc2VyIiwiQ29waWxvdCJdLCJpc3MiOiJodHRwczovL2FwaS50b3Bjb2Rlci5jb20iLCJoYW5kbGUiOiJ0Yy1Db3BpbG90IiwiZXhwIjoxNjg1NTcxNDYwLCJ1c2VySWQiOiIyMzE2Njc2NyIsImlhdCI6MTU4NTU3MDg2MCwiZW1haWwiOiJ0Yy1Db3BpbG90QGdtYWlsLmNvbSIsImp0aSI6IjBmMWVmMWQzLTJiMzMtNDkwMC1iYjQzLTQ4ZjIyODVmOTYyOCJ9.gP5JqJGCnOjO_gYs2r3-AQt5x8YIym15m3t43603cgc
-```
-
-#### 03 Admin
-
-- payload
-
-```json
-{
-  "roles": [
-    "Topcoder User","Copilot","Admin"
-  ],
-  "iss": "https://api.topcoder.com",
-  "handle": "tc-Admin",
-  "exp": 1685571460,
-  "userId": "23166768",
-  "iat": 1585570860,
-  "email": "tc-Admin@gmail.com",
-  "jti": "0f1ef1d3-2b33-4900-bb43-48f2285f9630"
-}
-```
-
-- token
-
-```json
-eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJyb2xlcyI6WyJUb3Bjb2RlciBVc2VyIiwiQ29waWxvdCIsIkFkbWluIl0sImlzcyI6Imh0dHBzOi8vYXBpLnRvcGNvZGVyLmNvbSIsImhhbmRsZSI6InRjLUFkbWluIiwiZXhwIjoxNjg1NTcxNDYwLCJ1c2VySWQiOiIyMzE2Njc2OCIsImlhdCI6MTU4NTU3MDg2MCwiZW1haWwiOiJ0Yy1BZG1pbkBnbWFpbC5jb20iLCJqdGkiOiIwZjFlZjFkMy0yYjMzLTQ5MDAtYmI0My00OGYyMjg1Zjk2MzAifQ.eR97kePT0Gu-t7vUE0Ed8A88Dnmtgebyml2jrRyxhOk
-```
-
-#### M2M token 01
-
-- payload, this token missing `all:usersSkill`, so all endpoints in usersSkill group will return 403
-
-```json
-{
-  "scopes": "all:user all:role all:skill all:usersRole all:organization all:skillsProvider",
-  "iss": "https://api.topcoder.com",
-  "handle":"tc-mm-01",
-  "exp": 1685571460,
-  "iat": 1585570860,
-  "jti": "0f1ef1d3-2b33-4900-bb43-48f2285f9630"
-}
-```
-
-- token
-
-```json
-eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJzY29wZXMiOiJhbGw6dXNlciBhbGw6cm9sZSBhbGw6c2tpbGwgYWxsOnVzZXJzUm9sZSBhbGw6b3JnYW5pemF0aW9uIGFsbDpza2lsbHNQcm92aWRlciIsImlzcyI6Imh0dHBzOi8vYXBpLnRvcGNvZGVyLmNvbSIsImhhbmRsZSI6InRjLW1tLTAxIiwiZXhwIjoxNjg1NTcxNDYwLCJpYXQiOjE1ODU1NzA4NjAsImp0aSI6IjBmMWVmMWQzLTJiMzMtNDkwMC1iYjQzLTQ4ZjIyODVmOTYzMCJ9.BlDIYsCTcHTib9XhpyzpO-KkMTTMy0egq_7qlLWRmoM
-```
-
-#### M2M token 02
-
-- payload, this token contains scope, can request all endpoints
-
-```json
-{
-  "scopes": "all:user all:role all:skill all:usersRole all:organization all:skillsProvider all:usersSkill all:externalProfile all:achievementsProvider all:achievement all:attributeGroup all:attribute all:userAttribute",
-  "iss": "https://api.topcoder.com",
-  "handle": "tc-mm-02",
-  "exp": 1685571460,
-  "iat": 1585570860,
-  "jti": "0f1ef1d3-2b33-4900-bb43-48f2285f9630"
-}
-```
-
-- token
-
-```json
-eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJzY29wZXMiOiJhbGw6dXNlciBhbGw6cm9sZSBhbGw6c2tpbGwgYWxsOnVzZXJzUm9sZSBhbGw6b3JnYW5pemF0aW9uIGFsbDpza2lsbHNQcm92aWRlciBhbGw6dXNlcnNTa2lsbCBhbGw6ZXh0ZXJuYWxQcm9maWxlIGFsbDphY2hpZXZlbWVudHNQcm92aWRlciBhbGw6YWNoaWV2ZW1lbnQgYWxsOmF0dHJpYnV0ZUdyb3VwIGFsbDphdHRyaWJ1dGUgYWxsOnVzZXJBdHRyaWJ1dGUiLCJpc3MiOiJodHRwczovL2FwaS50b3Bjb2Rlci5jb20iLCJoYW5kbGUiOiJ0Yy1tbS0wMiIsImV4cCI6MTY4NTU3MTQ2MCwiaWF0IjoxNTg1NTcwODYwLCJqdGkiOiIwZjFlZjFkMy0yYjMzLTQ5MDAtYmI0My00OGYyMjg1Zjk2MzAifQ.8XJahLdv9mkgkL7EsOwsf8uKg4J9u-1UM73pvZ9n3JY
-```
+See `Verification.md`
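
The test tokens removed above are plain JWT Bearer tokens, so endpoint checks can still be scripted directly instead of going through Postman. A minimal sketch, assuming the API runs locally on the default port 3002 with API_VERSION `api/1.0`; the `/users` path and the `TEST_TOKEN` variable are only examples, not routes confirmed by this commit.

```js
// Hypothetical sketch — not part of this commit
const http = require('http')

// e.g. export TEST_TOKEN=<one of the JWTs from the removed "Test token" section>
const token = process.env.TEST_TOKEN

const options = {
  host: 'localhost',
  port: 3002,              // default PORT from config/default.js
  path: '/api/1.0/users',  // example route; adjust to the endpoint under test
  headers: { Authorization: `Bearer ${token}` }
}

http.get(options, (res) => {
  let body = ''
  res.on('data', (chunk) => { body += chunk })
  res.on('end', () => console.log(res.statusCode, body))
}).on('error', (err) => console.error(err))
```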

VERIFICATION.md

Lines changed: 3 additions & 0 deletions
@@ -1,3 +1,6 @@
+<!-- TODO - Update BEFORE merge -->
+
+
 ## Verification
 The verification needs data in ES, there are two ways to populate data to ES.
 
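
For context, the README above also adds an `npm run migrate-db-to-es` script for pushing existing database rows into Elasticsearch. The actual script is not shown in this commit; the sketch below only illustrates the general shape of such a migration, assuming the `pg` and `@elastic/elasticsearch` clients and an example `users` table/index.

```js
// Hypothetical sketch — not part of this commit
const { Pool } = require('pg')
const { Client } = require('@elastic/elasticsearch')

const pool = new Pool() // reads the standard PG* environment variables
const es = new Client({ node: process.env.ES_HOST || 'http://localhost:9200' })

// Read every row from an example table and index it into ES under the same id
async function migrateUsers () {
  const { rows } = await pool.query('SELECT * FROM users')    // example table
  for (const row of rows) {
    await es.index({ index: 'users', id: row.id, body: row }) // example index
  }
  console.log(`Indexed ${rows.length} users`)
}

migrateUsers().catch((err) => { console.error(err); process.exit(1) })
```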
config/default.js

Lines changed: 8 additions & 5 deletions
@@ -4,7 +4,9 @@
 
 module.exports = {
   LOG_LEVEL: process.env.LOG_LEVEL || 'debug',
-  PORT: process.env.PORT || 3001,
+  PORT: process.env.PORT || 3002,
+
+  CASCADE_PAUSE_MS: process.env.CASCADE_PAUSE_MS || 1000,
 
   AUTH_SECRET: process.env.AUTH_SECRET || 'CLIENT_SECRET',
   VALID_ISSUERS: process.env.VALID_ISSUERS ? process.env.VALID_ISSUERS.replace(/\\"/g, '')
@@ -14,10 +16,11 @@ module.exports = {
   MAX_PAGE_SIZE: parseInt(process.env.MAX_PAGE_SIZE) || 100,
   API_VERSION: process.env.API_VERSION || 'api/1.0',
 
-  AWS_ACCESS_KEY_ID: process.env.AWS_ACCESS_KEY_ID,
-  AWS_SECRET_ACCESS_KEY: process.env.AWS_SECRET_ACCESS_KEY,
-  AWS_REGION: process.env.AWS_REGION || 'us-east-1',
-  DATABASE: process.env.DATABASE || 'ubahn-db',
+  DB_NAME: process.env.DB_NAME || 'ubahn-db',
+  DB_USERNAME: process.env.DB_USER || 'postgres',
+  DB_PASSWORD: process.env.DB_PASSWORD || 'password',
+  DB_HOST: process.env.DB_HOST || 'localhost',
+  DB_PORT: process.env.DB_PORT || 5432,
 
   AUTH0_URL: process.env.AUTH0_URL,
   AUTH0_AUDIENCE: process.env.AUTH0_AUDIENCE,
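
The new `CASCADE_PAUSE_MS` setting throttles cascade deletes. The service code that consumes it is not part of this diff; the sketch below is only a plausible illustration of pausing between per-record deletions.

```js
// Hypothetical sketch — not part of this commit
const config = require('config')

// Promise-based sleep used between deletions
const pause = (ms) => new Promise((resolve) => setTimeout(resolve, ms))

// Delete dependent records one at a time, waiting CASCADE_PAUSE_MS between
// deletions so the database and Elasticsearch are not flooded with writes
async function cascadeDelete (records, deleteOne) {
  for (const record of records) {
    await deleteOne(record)
    await pause(Number(config.get('CASCADE_PAUSE_MS')))
  }
}

module.exports = { cascadeDelete }
```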

docker/Dockerfile

Lines changed: 5 additions & 0 deletions
@@ -9,4 +9,9 @@ WORKDIR /ubahn_api
 
 # Install the dependencies from package.json
 RUN npm install
+
+# Expose port
+EXPOSE 3002
+
+# start api
 CMD npm start

docker/docker-compose.yml

Lines changed: 20 additions & 8 deletions
@@ -1,11 +1,23 @@
+# TODO - Update BEFORE merge
+
 version: '3'
 services:
-  ubahn_api:
-    image: ubahn_api:latest
-    build:
-      context: ../
-      dockerfile: docker/Dockerfile
-    env_file:
-      - api.env
+  postgres:
+    image: "postgres:13.1"
+    volumes:
+      - database-data:/var/lib/postgresql/data/
+    ports:
+      - "5432:5432"
+    environment:
+      POSTGRES_PASSWORD: ${DB_PASSWORD}
+      POSTGRES_USER: ${DB_USERNAME}
+      POSTGRES_DB: ${DB_NAME}
+  esearch:
+    image: elasticsearch:7.7.1
+    container_name: ubahn-data-processor-es_es
     ports:
-      - "3001:3001"
+      - "9200:9200"
+    environment:
+      - discovery.type=single-node
+volumes:
+  database-data:

docker/sample.api.env renamed to docker/sample.env

Lines changed: 6 additions & 2 deletions
@@ -1,5 +1,9 @@
-AWS_ACCESS_KEY_ID=<AWS Access Key ID>
-AWS_SECRET_ACCESS_KEY=<AWS Secret Access Key>
+DB_NAME=ubahn-db
+DB_USERNAME=postgres
+DB_PASSWORD=<password>
+DB_HOST=postgres
+DB_PORT=5432
+
 ES_HOST=<ES Host Endpoint>
 
 AUTH0_URL=<AUTH0 URL>
