Commit 8478d46

Merge pull request #24 from topcoder-platform/develop: sync with develop
2 parents 66a12d5 + 03d5a17

76 files changed: +25926 −2460 lines

.eslintrc.yaml

Lines changed: 1 addition & 0 deletions

```diff
@@ -0,0 +1 @@
+extends: standard
```

README.md

Lines changed: 127 additions & 28 deletions

````diff
@@ -4,6 +4,8 @@
 
 - nodejs https://nodejs.org/en/ (v10)
 - DynamoDB
+- AWS S3
+- Elasticsearch v6
 - Docker, Docker Compose
 
 ## Configuration
@@ -15,49 +17,146 @@ The following parameters can be set in config files or in env variables:
 - PORT: the server port, default is 3000
 - AUTH_SECRET: The authorization secret used during token verification.
 - VALID_ISSUERS: The valid issuer of tokens.
-- DYNAMODB.AWS_ACCESS_KEY_ID: The Amazon certificate key to use when connecting. Use local dynamodb you can set fake value
-- DYNAMODB.AWS_SECRET_ACCESS_KEY: The Amazon certificate access key to use when connecting. Use local dynamodb you can set fake value
-- DYNAMODB.AWS_REGION: The Amazon certificate region to use when connecting. Use local dynamodb you can set fake value
-- DYNAMODB.IS_LOCAL: Use Amazon DynamoDB Local or server.
-- DYNAMODB.URL: The local url if using Amazon DynamoDB Local
+- AUTH0_URL: Auth0 URL, used to get an M2M token
+- AUTH0_PROXY_SERVER_URL: Auth0 proxy server URL, used to get an M2M token
+- AUTH0_AUDIENCE: Auth0 audience, used to get an M2M token
+- TOKEN_CACHE_TIME: Auth0 token cache time, used to get an M2M token
+- AUTH0_CLIENT_ID: Auth0 client id, used to get an M2M token
+- AUTH0_CLIENT_SECRET: Auth0 client secret, used to get an M2M token
+- BUSAPI_URL: Bus API URL
+- KAFKA_ERROR_TOPIC: Kafka error topic used by the Bus API wrapper
+- AMAZON.AWS_ACCESS_KEY_ID: The AWS access key to use when connecting. For local DynamoDB you can set a fake value
+- AMAZON.AWS_SECRET_ACCESS_KEY: The AWS secret access key to use when connecting. For local DynamoDB you can set a fake value
+- AMAZON.AWS_REGION: The AWS region to use when connecting. For local DynamoDB you can set a fake value
+- AMAZON.IS_LOCAL_DB: Whether to use Amazon DynamoDB Local or the remote service.
+- AMAZON.DYNAMODB_URL: The local URL if using Amazon DynamoDB Local
+- AMAZON.ATTACHMENT_S3_BUCKET: the AWS S3 bucket used to store attachments
+- ES: config object for Elasticsearch
+- ES.HOST: Elasticsearch host
+- ES.API_VERSION: Elasticsearch API version
+- ES.ES_INDEX: Elasticsearch index name
+- ES.ES_TYPE: Elasticsearch index type
+- ES.ES_REFRESH: Elasticsearch refresh method. Defaults to the string `true` (i.e. refresh immediately)
+- FILE_UPLOAD_SIZE_LIMIT: the file upload size limit in bytes
+- RESOURCES_API_URL: TC resources API base URL
+- GROUPS_API_URL: TC groups API base URL
+- PROJECTS_API_URL: TC projects API base URL
+- COPILOT_RESOURCE_ROLE_IDS: copilot resource role ids allowed to upload attachments
+- HEALTH_CHECK_TIMEOUT: health check timeout in milliseconds
+- SCOPES: the configurable M2M token scopes, refer to `config/default.js` for more details
+- M2M_AUDIT_HANDLE: the audit name used when performing create/update operations using an M2M token
 
-## DynamoDB Setup with Docker
-We will use DynamoDB setup on Docker.
+Set the following environment variables so that the app can get a TC M2M token (use `set` instead of `export` on Windows):
 
-Just run `docker-compose up` in local folder
+- export AUTH0_CLIENT_ID=8QovDh27SrDu1XSs68m21A1NBP8isvOt
+- export AUTH0_CLIENT_SECRET=3QVxxu20QnagdH-McWhVz0WfsQzA1F8taDdGDI4XphgpEYZPcMTF4lX3aeOIeCzh
+- export AUTH0_URL=https://topcoder-dev.auth0.com/oauth/token
+- export AUTH0_AUDIENCE=https://m2m.topcoder-dev.com/
 
-If you have already installed aws-cli in your local machine, you can execute `./local/init-dynamodb.sh` to
-create the table. If not you can still create table following `Create Table via awscli in Docker`.
+Also properly configure the AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION, ATTACHMENT_S3_BUCKET and IS_LOCAL_DB config parameters.
 
-## Create Table via awscli in Docker
+Test configuration is at `config/test.js`. You don't need to change it.
+The following test parameters can be set in the config file or in env variables:
+
+- ADMIN_TOKEN: admin token
+- COPILOT_TOKEN: copilot token
+- USER_TOKEN: user token
+- EXPIRED_TOKEN: expired token
+- INVALID_TOKEN: invalid token
+- M2M_FULL_ACCESS_TOKEN: M2M full access token
+- M2M_READ_ACCESS_TOKEN: M2M read access token
+- M2M_UPDATE_ACCESS_TOKEN: M2M update (including 'delete') access token
+- S3_ENDPOINT: endpoint of the AWS S3 API, for unit and e2e tests only; defaults to `localhost:9000`
+
+## AWS S3 Setup
+Go to https://console.aws.amazon.com/ and log in. Choose S3 from the services and click `Create bucket`. Follow the instructions to create the S3 bucket.
+
+## Local services setup
+In the `local` folder, run `docker-compose up`.
+It starts Elasticsearch, DynamoDB and an S3-compatible server.
+
+## Mock API
+For Postman verification, please use the mock API under the `mock-api` folder. It provides mock endpoints to fetch challenge resources and groups.
+You need to ensure that the DynamoDB configuration in `mock-api/config/default.js` is consistent with `config/default.js`.
+
+## Create Tables
 1. Make sure DynamoDB is running as per the instructions above.
+2. Make sure you have configured all config parameters. Refer to [Configuration](#configuration).
+3. Run `npm run create-tables` to create tables.
 
-2. Run the following commands
-```
-docker exec -ti dynamodb sh
-```
-Next
-```
-./init-dynamodb.sh
-```
+## Scripts
+1. Drop/delete tables: `npm run drop-tables`
+2. Create tables: `npm run create-tables`
+3. Seed/insert data into tables: `npm run seed-tables`
+4. Initialize/clear database in the default environment: `npm run init-db`
+5. View table data in the default environment: `npm run view-data <ModelName>`; ModelName can be `Challenge`, `ChallengeType`, `ChallengeSetting`, `AuditLog`, `Phase`, `TimelineTemplate` or `Attachment`
+6. Create the Elasticsearch index: `npm run init-es`, or to re-create the index: `npm run init-es force`
+7. Synchronize ES data and DynamoDB data: `npm run sync-es`
 
-3. Now the tables have been created, you can use following command to verify
-```
-aws dynamodb scan --table-name Challenge --endpoint-url http://localhost:7777
-aws dynamodb scan --table-name ChallengeType --endpoint-url http://localhost:7777
-aws dynamodb scan --table-name ChallengeSetting --endpoint-url http://localhost:7777
-aws dynamodb scan --table-name AuditLog --endpoint-url http://localhost:7777
-```
+### Notes
+- The seed data are located in `src/scripts/seed`
 
 ## Local Deployment
 
 - Install dependencies `npm install`
 - Run lint `npm run lint`
 - Run lint fix `npm run lint:fix`
+- Initialize Elasticsearch, creating the configured Elasticsearch index if not present: `npm run init-es`,
+  or re-create the index: `npm run init-es force`
+- Create tables `npm run create-tables`
+- Clear and init db `npm run init-db`
 - Start app `npm start`
 - App is running at `http://localhost:3000`
-- Clear and init db `npm run init-db`
-- Insert test data `npm run test-data`
+- Start the mock API: go to the `mock-api` folder and run `npm start`; the mock API runs at `http://localhost:4000`
+
+## Running tests
+
+### Prepare
+- Start local services.
+- Start the mock API.
+- Create DynamoDB tables.
+- Initialize the ES index.
+- Various config parameters should be properly set.
+
+Seeding db data is not needed.
+
+### Running unit tests
+To run unit tests alone
+
+```bash
+npm run test
+```
+
+To run unit tests with coverage report
+
+```bash
+npm run test:cov
+```
+
+### Running integration tests
+To run integration tests alone
+
+```bash
+npm run e2e
+```
+
+To run integration tests with coverage report
+
+```bash
+npm run e2e:cov
+```
 
 ## Verification
 Refer to the verification document `Verification.md`
+
+## Notes
+
+- After uploading attachments, the returned attachment ids should be used to update the challenge;
+  finally, attachments have a challengeId field linking to their challenge,
+  and a challenge has an attachments field linking to its attachments;
+  this speeds up challenge CRUD operations.
+
+- In the app-constants.js Topics field, the used topics are test topics;
+  the suggested ones are commented out, because these topics are not created in the TC dev Kafka yet.
````
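The configuration parameters listed in the README above are typically wired up in `config/default.js` by reading environment variables with fallbacks. A minimal, hypothetical sketch of that pattern — the default values and key subset shown here are illustrative assumptions, not the repo's actual config:

```javascript
// Hypothetical sketch of the env-var-with-default pattern used by
// config/default.js; keys and defaults here are illustrative only.
const config = {
  PORT: process.env.PORT || 3000,
  AMAZON: {
    AWS_REGION: process.env.AWS_REGION || 'us-east-1',
    // boolean flags usually arrive as the strings 'true'/'false'
    IS_LOCAL_DB: process.env.IS_LOCAL_DB ? process.env.IS_LOCAL_DB === 'true' : true,
    DYNAMODB_URL: process.env.DYNAMODB_URL || 'http://localhost:8000'
  },
  ES: {
    HOST: process.env.ES_HOST || 'localhost:9200',
    ES_INDEX: process.env.ES_INDEX || 'challenge',
    // per the README, ES_REFRESH defaults to the string 'true'
    ES_REFRESH: process.env.ES_REFRESH || 'true'
  },
  FILE_UPLOAD_SIZE_LIMIT: process.env.FILE_UPLOAD_SIZE_LIMIT
    ? Number(process.env.FILE_UPLOAD_SIZE_LIMIT)
    : 50 * 1024 * 1024 // assumed default, in bytes
}

module.exports = config
```

Overriding any of these is then just a matter of exporting the corresponding environment variable before `npm start`.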

Verification.md

Lines changed: 18 additions & 11 deletions

````diff
@@ -1,17 +1,24 @@
 # TopCoder Challenge API Verification
 
 ## Postman tests
+- clear the environment: run `npm run init-db` and `npm run init-es force`
 - import Postman collection and environment in the docs folder to Postman
-- note that the Postman tests depend on the test data, so you must first run `npm run init-db` and `npm run test-data` to setup test data
-- Just run the whole test cases under provided environment.
+- run the tests from top to bottom in order
+- You need to run `npm run sync-es` before you run the `Challenges/get challenge` and `Challenges/search challenge` test cases.
 
 ## DynamoDB Verification
-1. Open a new console and run the command `docker exec -ti dynamodb sh` to use `aws-cli`
-
-2. On the console you opened in step 1, run the following commands to verify the data inserted into the database during the execution of the Postman tests
-```
-aws dynamodb scan --table-name Challenge --endpoint-url http://localhost:7777
-aws dynamodb scan --table-name ChallengeType --endpoint-url http://localhost:7777
-aws dynamodb scan --table-name ChallengeSetting --endpoint-url http://localhost:7777
-aws dynamodb scan --table-name AuditLog --endpoint-url http://localhost:7777
-```
+Run `npm run view-data <ModelName>` to view table data; ModelName can be `Challenge`, `ChallengeType`, `ChallengeSetting`, `AuditLog`, `Phase`, `TimelineTemplate` or `Attachment`
+
+## S3 Verification
+
+Log in to the AWS Console, go to the S3 service, and view the bucket content.
+
+## ElasticSearch Verification
+
+Run `npm run view-es-data` to view the data stored in ES.
+
+## Bus Event Verification
+
+- log in to `https://lauscher.topcoder-dev.com/` with the credentials `tonyj / appirio123`
+- then select a topic to view (see the app-constants.js Topics field for the used topics) and click the `View` button to view related messages
````

app-bootstrap.js

Lines changed: 1 addition & 1 deletion

```diff
@@ -4,7 +4,7 @@
 global.Promise = require('bluebird')
 const Joi = require('joi')
 
-Joi.optionalId = () => Joi.string()
+Joi.optionalId = () => Joi.string().uuid()
 Joi.id = () => Joi.optionalId().required()
 Joi.page = () => Joi.number().integer().min(1).default(1)
 Joi.perPage = () => Joi.number().integer().min(1).max(100).default(20)
```
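The `Joi.optionalId` change above tightens id validation from "any string" to "must be a UUID". A plain-JavaScript illustration of the constraint — this regex approximates the common 8-4-4-4-12 UUID form and is not Joi's actual implementation, which is more permissive about variants:

```javascript
// Approximation of what Joi.string().uuid() now enforces for ids:
// the value must look like a UUID, not just be any string.
const UUID_RE = /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$/i

function isValidId (value) {
  return typeof value === 'string' && UUID_RE.test(value)
}

// Before the change any string passed validation; after it, only UUIDs do:
isValidId('123')                                  // false
isValidId('8e17090c-465b-4c17-b6d9-dfa16300b0ff') // true
```

In practice this means clients that previously sent arbitrary string ids will now receive validation errors unless the ids are proper UUIDs.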

app-constants.js

Lines changed: 40 additions & 1 deletion

```diff
@@ -6,6 +6,45 @@ const UserRoles = {
   Copilot: 'Copilot'
 }
 
+const prizeSetTypes = {
+  ChallengePrizes: 'Challenge prizes',
+  CopilotPayment: 'Copilot payment',
+  ReviewerPayment: 'Reviewer payment',
+  CheckpointPrizes: 'Checkpoint prizes'
+}
+
+const challengeStatuses = {
+  Draft: 'Draft',
+  Canceled: 'Canceled',
+  Active: 'Active',
+  Completed: 'Completed'
+}
+
+const EVENT_ORIGINATOR = 'topcoder-challenges-api'
+
+const EVENT_MIME_TYPE = 'application/json'
+
+// using a testing topic; should be changed to use the real topics (in the comments) when they are created
+const Topics = {
+  ChallengeCreated: 'challenge.notification.create',
+  ChallengeUpdated: 'challenge.notification.update',
+  ChallengeTypeCreated: 'test.new.bus.events', // 'challenge.action.type.created',
+  ChallengeTypeUpdated: 'test.new.bus.events', // 'challenge.action.type.updated',
+  ChallengeSettingCreated: 'test.new.bus.events', // 'challenge.action.setting.created',
+  ChallengeSettingUpdated: 'test.new.bus.events', // 'challenge.action.setting.updated',
+  ChallengePhaseCreated: 'test.new.bus.events', // 'challenge.action.phase.created',
+  ChallengePhaseUpdated: 'test.new.bus.events', // 'challenge.action.phase.updated',
+  ChallengePhaseDeleted: 'test.new.bus.events', // 'challenge.action.phase.deleted',
+  TimelineTemplateCreated: 'test.new.bus.events', // 'challenge.action.timeline.template.created',
+  TimelineTemplateUpdated: 'test.new.bus.events', // 'challenge.action.timeline.template.updated',
+  TimelineTemplateDeleted: 'test.new.bus.events' // 'challenge.action.timeline.template.deleted'
+}
+
 module.exports = {
-  UserRoles
+  UserRoles,
+  prizeSetTypes,
+  challengeStatuses,
+  EVENT_ORIGINATOR,
+  EVENT_MIME_TYPE,
+  Topics
 }
```
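The new `EVENT_ORIGINATOR`, `EVENT_MIME_TYPE` and `Topics` constants are the ingredients of a Bus API event envelope. A hedged sketch of how they might be combined when publishing — the `buildEventMessage` helper is hypothetical, and the field names follow the common Topcoder bus message shape rather than this repo's exact helper code:

```javascript
const EVENT_ORIGINATOR = 'topcoder-challenges-api'
const EVENT_MIME_TYPE = 'application/json'
const Topics = { ChallengeCreated: 'challenge.notification.create' }

// Hypothetical helper: builds the envelope a bus event typically carries.
// The actual posting (via the Bus API wrapper) lives elsewhere in the repo.
function buildEventMessage (topic, payload) {
  return {
    topic,
    originator: EVENT_ORIGINATOR,
    timestamp: new Date().toISOString(),
    'mime-type': EVENT_MIME_TYPE,
    payload
  }
}

const msg = buildEventMessage(Topics.ChallengeCreated, { name: 'Test Challenge' })
// msg.topic is 'challenge.notification.create'
```

When the real `challenge.action.*` topics are created in TC dev Kafka, only the `Topics` values need to change; the envelope shape stays the same.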

app-routes.js

Lines changed: 42 additions & 4 deletions

```diff
@@ -30,29 +30,67 @@ module.exports = (app) => {
       next()
     })
 
-    // add Authenticator check if route has auth
+    actions.push((req, res, next) => {
+      if (_.get(req, 'query.token')) {
+        _.set(req, 'headers.authorization', `Bearer ${_.trim(req.query.token)}`)
+      }
+      next()
+    })
+
     if (def.auth) {
+      // add Authenticator/Authorization check if route has auth
       actions.push((req, res, next) => {
         authenticator(_.pick(config, ['AUTH_SECRET', 'VALID_ISSUERS']))(req, res, next)
       })
 
       actions.push((req, res, next) => {
         if (req.authUser.isMachine) {
-          next(new errors.ForbiddenError('M2M is not supported.'))
+          // M2M
+          if (!req.authUser.scopes || !helper.checkIfExists(def.scopes, req.authUser.scopes)) {
+            next(new errors.ForbiddenError('You are not allowed to perform this action!'))
+          } else {
+            req.authUser.handle = config.M2M_AUDIT_HANDLE
+            next()
+          }
         } else {
           req.authUser.userId = String(req.authUser.userId)
-          // User
+          // User roles authorization
           if (req.authUser.roles) {
-            if (!helper.checkIfExists(def.access, req.authUser.roles)) {
+            if (def.access && !helper.checkIfExists(def.access, req.authUser.roles)) {
               next(new errors.ForbiddenError('You are not allowed to perform this action!'))
             } else {
+              // the user token is used in create/update challenge to ensure the user can create/update a challenge under a specific project
+              req.userToken = req.headers.authorization.split(' ')[1]
               next()
             }
           } else {
             next(new errors.ForbiddenError('You are not authorized to perform this action'))
           }
         }
       })
+    } else {
+      // public API, but still try to authenticate the token if provided; missing/invalid tokens are allowed
+      actions.push((req, res, next) => {
+        const interceptRes = {}
+        interceptRes.status = () => interceptRes
+        interceptRes.json = () => interceptRes
+        interceptRes.send = () => next()
+        authenticator(_.pick(config, ['AUTH_SECRET', 'VALID_ISSUERS']))(req, interceptRes, next)
+      })
+
+      actions.push((req, res, next) => {
+        if (!req.authUser) {
+          next()
+        } else if (req.authUser.isMachine) {
+          if (!def.scopes || !req.authUser.scopes || !helper.checkIfExists(def.scopes, req.authUser.scopes)) {
+            req.authUser = undefined
+          }
+          next()
+        } else {
+          req.authUser.userId = String(req.authUser.userId)
+          next()
+        }
+      })
     }
 
     actions.push(method)
```
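The middleware above leans on `helper.checkIfExists(def.scopes, req.authUser.scopes)` (and the same helper for `def.access` vs. roles) to test whether the route's allowed values intersect the token's. A plain-JavaScript sketch of what such a helper plausibly does — the repo's real implementation may differ, for example in case handling or lodash usage:

```javascript
// Hypothetical sketch of helper.checkIfExists as used by the route middleware:
// returns true when at least one entry of `source` appears in `term`,
// i.e. the two lists have a non-empty intersection.
function checkIfExists (source, term) {
  if (!Array.isArray(source) || !Array.isArray(term)) {
    return false
  }
  return source.some((item) => term.includes(item))
}

// M2M example: the route allows these scopes, the token carries these scopes.
checkIfExists(['read:challenges', 'all:challenges'], ['read:challenges']) // true
checkIfExists(['all:challenges'], ['read:challenges'])                    // false
```

This intersection check is what lets a single route definition list several acceptable scopes or roles, any one of which grants access.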
