Supporting release for Connect 2.4.7 #196

Merged
merged 16 commits on Dec 11, 2018

2 changes: 1 addition & 1 deletion .circleci/config.yml
@@ -76,7 +76,7 @@ workflows:
- test
filters:
branches:
-only: ['dev']
+only: ['dev', 'feature/auth0-proxy-server']
- deployProd:
requires:
- test
10 changes: 6 additions & 4 deletions README.md
@@ -2,9 +2,11 @@

Microservice to manage CRUD operations for all things Projects.

### Note: The steps below are a best-effort guide for local deployment; we expect contributors, as developers, to resolve any run-time issues (e.g. OS or Node version issues) on their own.

### Local Development
* We use docker-compose for running dependencies locally. Instructions for Docker compose setup - https://docs.docker.com/compose/install/
-* Nodejs 6.9.4 - consider using [nvm](https://github.com/creationix/nvm) or equivalent to manage your node version
+* Nodejs 8.9.4 - consider using [nvm](https://github.com/creationix/nvm) or equivalent to manage your node version
* Install [libpg](https://www.npmjs.com/package/pg-native)
* Install node dependencies
`npm install`
@@ -63,9 +65,9 @@ New Kafka-related configuration options have been introduced:
}
```
Environment variables:
-KAFKA_HOSTS - same as "hosts"
-KAFKA_CLIENT_CERT - same as "clientCert"
-KAFKA_CLIENT_CERT_KEY - same as "clientCertKey"
+- `KAFKA_HOSTS` - same as "hosts"
+- `KAFKA_CLIENT_CERT` - same as "clientCert"
+- `KAFKA_CLIENT_CERT_KEY` - same as "clientCertKey"
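
For illustration, a minimal sketch of how these options could be read at runtime with the `config` module (the `kafkaConfig` key names are an assumption for this example, not taken from this PR):

```js
const config = require('config');

// Sketch: resolve the Kafka connection options described above.
// In a typical node-config setup each key would fall back to the
// corresponding environment variable (KAFKA_HOSTS, KAFKA_CLIENT_CERT, ...).
const kafkaOptions = {
  hosts: config.get('kafkaConfig.hosts'),
  clientCert: config.get('kafkaConfig.clientCert'),
  clientCertKey: config.get('kafkaConfig.clientCertKey'),
};
```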

### Test

37 changes: 18 additions & 19 deletions deploy.sh
@@ -11,6 +11,7 @@ AWS_ECS_CONTAINER_NAME="tc-project-service"
AWS_REPOSITORY=$(eval "echo \$${ENV}_AWS_REPOSITORY")
AWS_ECS_CLUSTER=$(eval "echo \$${ENV}_AWS_ECS_CLUSTER")
AWS_ECS_SERVICE=$(eval "echo \$${ENV}_AWS_ECS_SERVICE")
AWS_ECS_SERVICE_CONSUMERS=$(eval "echo \$${ENV}_AWS_ECS_SERVICE_CONSUMERS")
AUTH_DOMAIN=$(eval "echo \$${ENV}_AUTH_DOMAIN")
AUTH_SECRET=$(eval "echo \$${ENV}_AUTH_SECRET")
VALID_ISSUERS=$(eval "echo \$${ENV}_VALID_ISSUERS")
@@ -29,9 +30,9 @@ configure_aws_cli() {
# deploys the app to the ecs cluster
deploy_cluster() {

-make_task_def
-register_definition
-if [[ $(aws ecs update-service --cluster $AWS_ECS_CLUSTER --service $AWS_ECS_SERVICE --task-definition $revision | \
+make_task_def $1 $2 $3 $4
+register_definition $1
+if [[ $(aws ecs update-service --cluster $AWS_ECS_CLUSTER --service $1 --task-definition $revision | \
$JQ '.service.taskDefinition') != $revision ]]; then
echo "Error updating service."
return 1
@@ -46,6 +47,7 @@ make_task_def(){
"family": "%s",
"requiresCompatibilities": ["EC2", "FARGATE"],
"networkMode": "awsvpc",
"taskRoleArn": "arn:aws:iam::%s:role/tc-project-service-ecs-task-role",
"executionRoleArn": "arn:aws:iam::%s:role/ecsTaskExecutionRole",
"cpu": "1024",
"memory": "2048",
@@ -56,6 +58,7 @@
"essential": true,
"memory": 1536,
"cpu": 768,
"entryPoint": ["%s", "%s", "%s"],
"environment": [
{
"name": "NODE_ENV",
@@ -85,14 +88,6 @@ make_task_def(){
"name": "AWS_REGION",
"value": "%s"
},
-{
-"name": "AWS_ACCESS_KEY_ID",
-"value": "%s"
-},
-{
-"name": "AWS_SECRET_ACCESS_KEY",
-"value": "%s"
-},
{
"name": "AUTH_DOMAIN",
"value": "%s"
@@ -255,16 +250,16 @@ make_task_def(){
KAFKA_CLIENT_CERT_KEY=$(eval "echo \$${ENV}_KAFKA_CLIENT_CERT_KEY")
KAFKA_GROUP_ID=$(eval "echo \$${ENV}_KAFKA_GROUP_ID")
KAFKA_URL=$(eval "echo \$${ENV}_KAFKA_URL")

-AUTH0_PROXY_SERVER_URL=$(eval "echo \$${ENV}_AUTH0_PROXY_SERVER_URL")
+AUTH0_PROXY_SERVER_URL=$(eval "echo \$${ENV}_AUTH0_PROXY_SERVER_URL")

-task_def=$(printf "$task_template" $family $ACCOUNT_ID $AWS_ECS_CONTAINER_NAME $ACCOUNT_ID $AWS_REGION $AWS_REPOSITORY $CIRCLE_SHA1 $NODE_ENV $ENABLE_FILE_UPLOAD $LOG_LEVEL $CAPTURE_LOGS $LOGENTRIES_TOKEN $API_VERSION $AWS_REGION $AWS_ACCESS_KEY_ID $AWS_SECRET_ACCESS_KEY $AUTH_DOMAIN $AUTH_SECRET $VALID_ISSUERS $DB_MASTER_URL $MEMBER_SERVICE_ENDPOINT $IDENTITY_SERVICE_ENDPOINT $BUS_API_URL $MESSAGE_SERVICE_URL $SYSTEM_USER_CLIENT_ID $SYSTEM_USER_CLIENT_SECRET $PROJECTS_ES_URL $PROJECTS_ES_INDEX_NAME $RABBITMQ_URL $DIRECT_PROJECT_SERVICE_ENDPOINT $FILE_SERVICE_ENDPOINT $CONNECT_PROJECTS_URL $SEGMENT_ANALYTICS_KEY "$AUTH0_URL" "$AUTH0_AUDIENCE" $AUTH0_CLIENT_ID "$AUTH0_CLIENT_SECRET" $TOKEN_CACHE_TIME "$KAFKA_CLIENT_CERT" "$KAFKA_CLIENT_CERT_KEY" $KAFKA_GROUP_ID $KAFKA_URL "$AUTH0_PROXY_SERVER_URL" $PORT $PORT $AWS_ECS_CLUSTER $AWS_REGION $NODE_ENV)
+task_def=$(printf "$task_template" $1 $ACCOUNT_ID $ACCOUNT_ID $AWS_ECS_CONTAINER_NAME $ACCOUNT_ID $AWS_REGION $AWS_REPOSITORY $CIRCLE_SHA1 $2 $3 $4 $NODE_ENV $ENABLE_FILE_UPLOAD $LOG_LEVEL $CAPTURE_LOGS $LOGENTRIES_TOKEN $API_VERSION $AWS_REGION $AUTH_DOMAIN $AUTH_SECRET $VALID_ISSUERS $DB_MASTER_URL $MEMBER_SERVICE_ENDPOINT $IDENTITY_SERVICE_ENDPOINT $BUS_API_URL $MESSAGE_SERVICE_URL $SYSTEM_USER_CLIENT_ID $SYSTEM_USER_CLIENT_SECRET $PROJECTS_ES_URL $PROJECTS_ES_INDEX_NAME $RABBITMQ_URL $DIRECT_PROJECT_SERVICE_ENDPOINT $FILE_SERVICE_ENDPOINT $CONNECT_PROJECTS_URL $SEGMENT_ANALYTICS_KEY "$AUTH0_URL" "$AUTH0_AUDIENCE" $AUTH0_CLIENT_ID "$AUTH0_CLIENT_SECRET" $TOKEN_CACHE_TIME "$KAFKA_CLIENT_CERT" "$KAFKA_CLIENT_CERT_KEY" $KAFKA_GROUP_ID $KAFKA_URL "$AUTH0_PROXY_SERVER_URL" $PORT $PORT $AWS_ECS_CLUSTER $AWS_REGION $NODE_ENV)
}

push_ecr_image(){
eval $(aws ecr get-login --region $AWS_REGION --no-include-email)
-docker push $ACCOUNT_ID.dkr.ecr.$AWS_REGION.amazonaws.com/$AWS_REPOSITORY:$CIRCLE_SHA1
+docker push $ACCOUNT_ID.dkr.ecr.$AWS_REGION.amazonaws.com/$1:$CIRCLE_SHA1
}

register_definition() {
@@ -279,13 +274,13 @@ register_definition() {
check_service_status() {
counter=0
sleep 60
-servicestatus=`aws ecs describe-services --service $AWS_ECS_SERVICE --cluster $AWS_ECS_CLUSTER | $JQ '.services[].events[0].message'`
+servicestatus=`aws ecs describe-services --service $1 --cluster $AWS_ECS_CLUSTER | $JQ '.services[].events[0].message'`
while [[ $servicestatus != *"steady state"* ]]
do
echo "Current event message : $servicestatus"
echo "Waiting for 30 seconds to check the service status...."
sleep 30
-servicestatus=`aws ecs describe-services --service $AWS_ECS_SERVICE --cluster $AWS_ECS_CLUSTER | $JQ '.services[].events[0].message'`
+servicestatus=`aws ecs describe-services --service $1 --cluster $AWS_ECS_CLUSTER | $JQ '.services[].events[0].message'`
counter=`expr $counter + 1`
if [[ $counter -gt $COUNTER_LIMIT ]] ; then
echo "Service does not reach steady state within 10 minutes. Please check"
@@ -296,6 +291,10 @@ check_service_status() {
}

configure_aws_cli
-push_ecr_image
-deploy_cluster
-check_service_status
+push_ecr_image $AWS_REPOSITORY
+deploy_cluster $AWS_ECS_SERVICE "npm" "run" "start"
+
+deploy_cluster $AWS_ECS_SERVICE_CONSUMERS "npm" "run" "startKafkaConsumers"
+
+check_service_status $AWS_ECS_SERVICE
+check_service_status $AWS_ECS_SERVICE_CONSUMERS
3 changes: 2 additions & 1 deletion package.json
@@ -15,6 +15,7 @@
"migrate:es": "./node_modules/.bin/babel-node migrations/seedElasticsearchIndex.js",
"prestart": "npm run -s build",
"start": "node dist",
"startKafkaConsumers": "npm run -s build && node dist/index-kafka.js",
"start:dev": "NODE_ENV=development PORT=8001 nodemon -w src --exec \"babel-node src --presets es2015\" | ./node_modules/.bin/bunyan",
"test": "NODE_ENV=test npm run lint && NODE_ENV=test npm run sync:es && NODE_ENV=test npm run sync:db && NODE_ENV=test ./node_modules/.bin/istanbul cover ./node_modules/mocha/bin/_mocha -- --timeout 10000 --compilers js:babel-core/register $(find src -path '*spec.js*')",
"test:watch": "NODE_ENV=test ./node_modules/.bin/mocha -w --compilers js:babel-core/register $(find src -path '*spec.js*')",
@@ -49,7 +50,7 @@
"express-request-id": "^1.1.0",
"express-sanitizer": "^1.0.2",
"express-validation": "^0.6.0",
"http-aws-es": "^1.1.3",
"http-aws-es": "^4.0.0",
"joi": "^8.0.5",
"jsonwebtoken": "^8.3.0",
"lodash": "^4.16.4",
2 changes: 1 addition & 1 deletion src/constants.js
@@ -87,7 +87,7 @@ export const BUS_API_EVENT = {

PROJECT_LINK_CREATED: 'notifications.connect.project.linkCreated',
PROJECT_FILE_UPLOADED: 'notifications.connect.project.fileUploaded',
-  PROJECT_SPECIFICATION_MODIFIED: 'notifications.connect.project.specificationModified',
+  PROJECT_SPECIFICATION_MODIFIED: 'connect.action.project.updated.spec',
PROJECT_PROGRESS_MODIFIED: 'connect.action.project.updated.progress',
PROJECT_FILES_UPDATED: 'connect.action.project.files.updated',
PROJECT_TEAM_UPDATED: 'connect.action.project.team.updated',
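
The renamed value follows the `connect.action.*` convention used by the neighbouring events. As a hedged sketch, publishing it might look like this (the `createEvent` helper and the payload fields are assumptions for illustration, not part of this diff):

```js
import { BUS_API_EVENT } from '../constants';
// hypothetical helper; the real bus-api wiring is not shown in this PR
import { createEvent } from './busApi';

// Sketch: notify subscribers that a project's specification changed.
function notifySpecificationModified(project, userId, logger) {
  return createEvent(BUS_API_EVENT.PROJECT_SPECIFICATION_MODIFIED, {
    projectId: project.id,     // assumed payload fields
    projectName: project.name,
    userId,
  }, logger);
}
```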
63 changes: 57 additions & 6 deletions src/events/projectPhases/index.js
@@ -107,15 +107,14 @@ const projectPhaseAddedHandler = Promise.coroutine(function* (logger, msg, chann
});

/**
- * Handler for project phase updated event
+ * Indexes the project phase in the elastic search.
 *
 * @param {Object} logger logger to log along with trace id
- * @param {Object} msg event payload
- * @param {Object} channel channel to ack, nack
+ * @param {Object} data event payload
 * @returns {undefined}
 */
-const projectPhaseUpdatedHandler = Promise.coroutine(function* (logger, msg, channel) { // eslint-disable-line func-names
+const updateIndexProjectPhase = Promise.coroutine(function* (logger, data) { // eslint-disable-line func-names
  try {
-    const data = JSON.parse(msg.content.toString());
const doc = yield eClient.get({ index: ES_PROJECT_INDEX, type: ES_PROJECT_TYPE, id: data.original.projectId });
const phases = _.map(data.allPhases, single => _.omit(single, ['deletedAt', 'deletedBy']));
const merged = _.assign(doc._source, { phases }); // eslint-disable-line no-underscore-dangle
@@ -127,7 +126,59 @@
doc: merged,
},
});
-    logger.debug('elasticsearch index updated, project phase updated successfully');
+    logger.debug('project phase updated to project document successfully');
} catch (error) {
logger.error('Error indexing the project phase', error);
// throw the error back to nack the bus
throw error;
}
});

/**
 * Updates the topic of the given phase in message api.
 *
 * @param {Object} logger logger to log along with trace id
 * @param {Object} phase the project phase whose topic should be updated
 * @returns {undefined}
*/
const updatePhaseTopic = Promise.coroutine(function* (logger, phase) { // eslint-disable-line func-names
try {
logger.debug('Updating topic for phase', phase);
const topic = yield messageService.getPhaseTopic(phase.projectId, phase.id, logger);
logger.trace('Topic', topic);
const title = phase.name;
const titleChanged = topic && topic.title !== title;
logger.trace('titleChanged', titleChanged);
const contentPost = topic && topic.posts && topic.posts.length > 0 ? topic.posts[0] : null;
logger.trace('contentPost', contentPost);
const postId = _.get(contentPost, 'id');
const content = _.get(contentPost, 'body');
if (postId && content && titleChanged) {
const updatedTopic = yield messageService.updateTopic(topic.id, { title, postId, content }, logger);
logger.debug('topic for the phase updated successfully');
logger.trace('updated topic', updatedTopic);
}
} catch (error) {
logger.error('Error in updating topic for the project phase', error);
// don't throw the error back to nack the bus, because we don't want to get multiple topics per phase
// we can create topic for a phase manually, if somehow it fails
}
});

/**
* Handler for project phase updated event
* @param {Object} logger logger to log along with trace id
* @param {Object} msg event payload
* @param {Object} channel channel to ack, nack
* @returns {undefined}
*/
const projectPhaseUpdatedHandler = Promise.coroutine(function* (logger, msg, channel) { // eslint-disable-line func-names
try {
const data = JSON.parse(msg.content.toString());
logger.debug('calling updateIndexProjectPhase', data);
yield updateIndexProjectPhase(logger, data, channel);
logger.debug('calling updatePhaseTopic', data.updated);
yield updatePhaseTopic(logger, data.updated);
channel.ack(msg);
} catch (error) {
logger.error('Error handling project.phase.updated event', error);
51 changes: 51 additions & 0 deletions src/index-kafka.js
@@ -0,0 +1,51 @@
import _ from 'lodash';
import config from 'config';
import startKafkaConsumer from './services/kafkaConsumer';
import { kafkaHandlers } from './events';
import models from './models';

const coreLib = require('tc-core-library-js');


// =======================
// Logger =========
// =======================
let appName = 'tc-projects-consumer';
switch (process.env.NODE_ENV.toLowerCase()) {
case 'development':
appName += '-dev';
break;
case 'qa':
appName += '-qa';
break;
case 'production':
default:
appName += '-prod';
break;
}

const logger = coreLib.logger({
name: appName,
level: _.get(config, 'logLevel', 'debug').toLowerCase(),
captureLogs: config.get('captureLogs'),
logentriesToken: _.get(config, 'logentriesToken', null),
});

// =======================
// Database =========
// =======================
logger.info('Registering models ... ', !!models);

/**
* Handle server shutdown gracefully
* @returns {undefined}
*/
function gracefulShutdown() {
// TODO
}
process.on('SIGTERM', gracefulShutdown);
process.on('SIGINT', gracefulShutdown);

const app = { logger, models };

module.exports = startKafkaConsumer(kafkaHandlers, app, logger);
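
For context, a rough sketch of the dispatch shape this entry point assumes from `./services/kafkaConsumer` (hypothetical, since the consumer implementation itself is not part of this diff):

```js
// Hypothetical shape: kafkaHandlers maps a Kafka topic name to a handler
// function, and each JSON-encoded message is routed to its handler.
function dispatchKafkaMessage(kafkaHandlers, app, logger, topic, rawMessage) {
  const handler = kafkaHandlers[topic];
  if (!handler) {
    logger.debug(`no handler configured for topic ${topic}, skipping message`);
    return Promise.resolve();
  }
  return Promise.resolve().then(() => handler(app, topic, JSON.parse(rawMessage)));
}
```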
12 changes: 6 additions & 6 deletions src/services/index.js
@@ -2,8 +2,8 @@

import config from 'config';
import RabbitMQService from './rabbitmq';
-import startKafkaConsumer from './kafkaConsumer';
-import { kafkaHandlers } from '../events';
+// import startKafkaConsumer from './kafkaConsumer';
+// import { kafkaHandlers } from '../events';

/**
* Responsible for establishing connections to all external services
@@ -33,10 +33,10 @@ module.exports = (fapp, logger) => {
.then(() => {
logger.info('RabbitMQ service initialized');
})
-    .then(() => startKafkaConsumer(kafkaHandlers, app, logger))
-    .then(() => {
-      logger.info('Kafka consumer service initialized');
-    })
+    // .then(() => startKafkaConsumer(kafkaHandlers, app, logger))
+    // .then(() => {
+    //   logger.info('Kafka consumer service initialized');
+    // })
.catch((err) => {
logger.error('Error initializing services', err);
// gracefulShutdown()
39 changes: 35 additions & 4 deletions src/services/messageService.js
@@ -87,6 +87,35 @@ function createTopic(topic, logger) {
});
}

/**
* Updates the given topic in message api
*
* @param {Number} topicId id of the topic to be updated
* @param {Object} topic the topic, should be a JSON object
* @param {Object} logger object
 * @return {Promise} updated topic promise
*/
function updateTopic(topicId, topic, logger) {
logger.debug(`updateTopic for topic: ${JSON.stringify(topic)}`);
return getClient(logger).then((msgClient) => {
logger.debug('calling message service');
return msgClient.post(`/topics/${topicId}/edit`, topic)
.then((resp) => {
logger.debug('Topic updated successfully');
logger.debug(`Topic updated successfully [status]: ${resp.status}`);
logger.debug(`Topic updated successfully [data]: ${resp.data}`);
return _.get(resp.data, 'result.content', {});
})
.catch((error) => {
logger.debug('Error updating topic');
logger.error(error);
// eslint-disable-line
});
}).catch((errMessage) => {
logger.debug(errMessage);
});
}
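
For reference, the phase-update handler added in `src/events/projectPhases/index.js` calls this function as:

```js
const updatedTopic = yield messageService.updateTopic(topic.id, { title, postId, content }, logger);
```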

/**
* Deletes the given posts for the given topic.
*
@@ -121,12 +150,13 @@
* @return {Promise} topic promise
*/
function getPhaseTopic(projectId, phaseId, logger) {
-  logger.debug(`getPhaseTopic for phaseId: ${phaseId}`);
+  logger.debug(`getPhaseTopic for projectId: ${projectId} phaseId: ${phaseId}`);
return getClient(logger).then((msgClient) => {
logger.debug(`calling message service for fetching phaseId#${phaseId}`);
-    return msgClient.get('/topics/list', {
-      params: { filter: `reference=project&referenceId=${projectId}&tag=phase#${phaseId}` },
-    }).then((resp) => {
+    const encodedFilter = encodeURIComponent(`reference=project&referenceId=${projectId}&tag=phase#${phaseId}`);
+    return msgClient.get(`/topics/list/db?filter=${encodedFilter}`)
+      .then((resp) => {
logger.debug('Fetched phase topic', resp);
const topics = _.get(resp.data, 'result.content', []);
if (topics && topics.length > 0) {
return topics[0];
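
Encoding the filter matters because the raw string contains `=`, `&`, and `#`, which would otherwise be interpreted as query-string syntax. For example, with plain `encodeURIComponent`:

```js
// For projectId = 123 and phaseId = 456:
encodeURIComponent('reference=project&referenceId=123&tag=phase#456');
// => 'reference%3Dproject%26referenceId%3D123%26tag%3Dphase%23456'
// so the request path becomes:
// /topics/list/db?filter=reference%3Dproject%26referenceId%3D123%26tag%3Dphase%23456
```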
@@ -153,6 +183,7 @@ function deleteTopic(topicId, logger) {

module.exports = {
createTopic,
updateTopic,
deletePosts,
getPhaseTopic,
deleteTopic,