# Topcoder Member API v5

## Dependencies

- Node.js https://nodejs.org/en/ (v10)
- DynamoDB
- AWS S3
- Elasticsearch v6
- Docker, Docker Compose

## Configuration

Configuration for the application is at `config/default.js`.
The following parameters can be set in config files or in env variables:

- LOG_LEVEL: the log level, default is 'debug'
- PORT: the server port, default is 3000
- AUTH_SECRET: the authorization secret used during token verification
- VALID_ISSUERS: the valid issuers of tokens
- AUTH0_URL: Auth0 URL, used to get the M2M token
- AUTH0_PROXY_SERVER_URL: Auth0 proxy server URL, used to get the M2M token
- AUTH0_AUDIENCE: Auth0 audience, used to get the M2M token
- TOKEN_CACHE_TIME: Auth0 token cache time, used to get the M2M token
- AUTH0_CLIENT_ID: Auth0 client id, used to get the M2M token
- AUTH0_CLIENT_SECRET: Auth0 client secret, used to get the M2M token
- BUSAPI_URL: Bus API URL
- KAFKA_ERROR_TOPIC: Kafka error topic used by the Bus API wrapper
- AMAZON.AWS_ACCESS_KEY_ID: the AWS access key id to use when connecting; when using local DynamoDB you can set a fake value
- AMAZON.AWS_SECRET_ACCESS_KEY: the AWS secret access key to use when connecting; when using local DynamoDB you can set a fake value
- AMAZON.AWS_REGION: the AWS region to use when connecting; when using local DynamoDB you can set a fake value
- AMAZON.IS_LOCAL_DB: whether to use Amazon DynamoDB Local or the real server
- AMAZON.DYNAMODB_URL: the local URL if using Amazon DynamoDB Local
- AMAZON.PHOTO_S3_BUCKET: the AWS S3 bucket to store photos
- AMAZON.S3_API_VERSION: the AWS S3 API version
- ES: config object for Elasticsearch
- ES.HOST: Elasticsearch host
- ES.API_VERSION: Elasticsearch API version
- ES.ES_INDEX: Elasticsearch index name
- ES.ES_TYPE: Elasticsearch index type
- FILE_UPLOAD_SIZE_LIMIT: the file upload size limit in bytes
- ID_FIELDS: member identifiable info fields; only admins, M2M clients, or the member themselves can get these fields
- PHOTO_URL_TEMPLATE: photo URL template, its `<key>` placeholder will be replaced with the S3 object key
- VERIFY_TOKEN_EXPIRATION: verify token expiration in minutes
- EMAIL_VERIFY_AGREE_URL: email verify agree URL, its `<emailVerifyToken>` placeholder will be replaced with the generated verify token
- EMAIL_VERIFY_DISAGREE_URL: email verify disagree URL
- SCOPES: the configurable M2M token scopes, refer to `config/default.js` for more details

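The template-style parameters (`PHOTO_URL_TEMPLATE`, `EMAIL_VERIFY_AGREE_URL`) carry `<placeholder>` markers that the service fills in at runtime. A minimal sketch of that substitution — the helper name and example URLs here are illustrative, not taken from the codebase:

```javascript
// Illustrative placeholder substitution, assuming values are plain strings.
function resolveTemplate (template, values) {
  // Replace each <name> placeholder with its value, if provided
  return template.replace(/<(\w+)>/g, (match, name) =>
    name in values ? values[name] : match)
}

// PHOTO_URL_TEMPLATE: <key> becomes the S3 object key
const photoUrl = resolveTemplate(
  'https://photos.example.com/<key>',
  { key: 'member-123.png' }
)
console.log(photoUrl) // https://photos.example.com/member-123.png

// EMAIL_VERIFY_AGREE_URL: <emailVerifyToken> becomes the generated token
const agreeUrl = resolveTemplate(
  'https://example.com/verify?token=<emailVerifyToken>',
  { emailVerifyToken: 'abc123' }
)
console.log(agreeUrl) // https://example.com/verify?token=abc123
```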

Set the following environment variables used by the bus API to get the TC M2M token (use `set` instead of `export` on Windows):
```
export AUTH0_CLIENT_ID=8QovDh27SrDu1XSs68m21A1NBP8isvOt
export AUTH0_CLIENT_SECRET=3QVxxu20QnagdH-McWhVz0WfsQzA1F8taDdGDI4XphgpEYZPcMTF4lX3aeOIeCzh
export AUTH0_URL=https://topcoder-dev.auth0.com/oauth/token
export AUTH0_AUDIENCE=https://m2m.topcoder-dev.com/
```
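
For reference, the M2M token is obtained through Auth0's standard client-credentials grant: a POST to AUTH0_URL with a body built from the variables above. A sketch of that request body — the function name is illustrative, and in this API the actual HTTP call is made by a helper library inside the bus API wrapper, not by hand:

```javascript
// Illustrative: the client-credentials request body sent to AUTH0_URL.
function buildTokenRequest (clientId, clientSecret, audience) {
  return {
    grant_type: 'client_credentials', // Auth0 machine-to-machine grant
    client_id: clientId,
    client_secret: clientSecret,
    audience: audience
  }
}

// In the app these values come from the environment variables above
const body = buildTokenRequest(
  process.env.AUTH0_CLIENT_ID,
  process.env.AUTH0_CLIENT_SECRET,
  process.env.AUTH0_AUDIENCE
)
console.log(body.grant_type) // client_credentials
```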

Also properly configure the AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION, PHOTO_S3_BUCKET and IS_LOCAL_DB config parameters.

Test configuration is at `config/test.js`. You shouldn't need to change it.
The following test parameters can be set in the config file or in env variables:

- ADMIN_TOKEN: admin token
- USER_TOKEN: user token
- EXPIRED_TOKEN: expired token
- INVALID_TOKEN: invalid token
- M2M_FULL_ACCESS_TOKEN: M2M full access token
- M2M_READ_ACCESS_TOKEN: M2M read access token
- M2M_UPDATE_ACCESS_TOKEN: M2M update (including 'delete') access token
- S3_ENDPOINT: endpoint of the AWS S3 API, for unit and e2e tests only; defaults to `localhost:9000`

## AWS S3 Setup
Go to https://console.aws.amazon.com/ and log in. Choose S3 from the Services menu and click `Create bucket`. Follow the instructions to create the S3 bucket.
After creating the bucket, open its `Permissions` tab and, in the `Block public access` section, disable the block so that public access
is allowed and publicly accessible photos can be uploaded to the bucket.

## Local services setup
In the `local` folder, run `docker-compose up`.
This starts Elasticsearch, DynamoDB and an S3-compatible server.

## Create Tables
1. Make sure DynamoDB is running as per the instructions above.
2. Make sure you have configured all config parameters. Refer to [Configuration](#configuration).
3. Run `npm run create-tables` to create the tables.

## Scripts
1. Drop DB tables: `npm run drop-tables`
2. Create DB tables: `npm run create-tables`
3. Initialize/Clear database: `npm run init-db`
4. Create Elasticsearch index: `npm run init-es`, or to re-create the index: `npm run init-es force`
5. Seed/Insert data into ES and DB: `npm run seed-data`
6. View DB table data: `npm run view-db-data <ModelName>`, where ModelName can be `Member`
7. View ES data: `npm run view-es-data`

## Local Deployment

- Install dependencies: `npm install`
- Run lint: `npm run lint`
- Run lint fix: `npm run lint:fix`
- Initialize Elasticsearch, creating the configured index if not present: `npm run init-es`,
  or re-create the index: `npm run init-es force`
- Create tables: `npm run create-tables`
- Clear and init DB data: `npm run init-db`
- Start the app: `npm start`
- The app is running at `http://localhost:3000`

## Local Deployment with Docker

To run the app using Docker, follow these steps:

1. Navigate to the directory `docker`

2. Rename the file `sample.api.env` to `api.env`

3. Set the required AWS credentials in the file `api.env`; you may add more env variables to this file if needed

4. Once that is done, run the following command

```
docker-compose up
```

5. When you run the application for the first time, it will take some time to download the image and install the dependencies

## Running tests

### Prepare
- Various config parameters should be properly set.
- Start the local services.
- Create the DynamoDB tables.
- Initialize the ES index.
- There is no need to seed test data; the tests should run in a clean environment. If there is data in ES or the DB, clear it by running
  `npm run init-db` and `npm run init-es force`.

### Running unit tests
To run unit tests alone:

```bash
npm run test
```

To run unit tests with a coverage report:

```bash
npm run test:cov
```

### Running integration tests
To run integration tests alone:

```bash
npm run e2e
```

To run integration tests with a coverage report:

```bash
npm run e2e:cov
```

## Verification
Refer to the verification document `Verification.md`.

## Notes

- All JWT tokens provided in the Postman environment file and tests were created at `https://jwt.io` with the secret `mysecret`.