
Commit 56f7642
Merge branch 'dev' into deploy-to-ec2
2 parents: aa0553b + c52d6f0

File tree: 122 files changed, +5830 −2179 lines


.github/CODEOWNERS (+4)

@@ -0,0 +1,4 @@
+# See https://help.github.com/articles/about-codeowners/
+
+# A Conveyal employee is required to approve PR merges
+* @conveyal/employees

.github/issue_template.md (+21)

@@ -0,0 +1,21 @@
+_**NOTE:** This issue system is intended for reporting bugs and tracking progress in software development. Although this software is licensed with an open-source license, any issue opened here may not be responded to in a timely manner. [Conveyal](https://www.conveyal.com) is unable to provide technical support for custom deployments of this software unless your company has a support contract with us. Please remove this note when creating the issue._
+
+## Observed behavior
+
+Please explain what is being observed within the application here.
+
+## Expected behavior
+
+Please explain what should happen instead.
+
+## Steps to reproduce the problem
+
+Please be as specific as possible.
+
+## Any special notes on configuration used
+
+Please describe any applicable config files that were used
+
+## Version of datatools-server and datatools-ui if applicable (exact commit hash or branch name)
+
+If using this in conjunction with [datatools-ui](https://github.com/conveyal/datatools-ui), this info can be found by clicking on the gear icon on the sidebar.

.github/pull_request_template.md (+14)

@@ -0,0 +1,14 @@
+### Checklist
+
+- [ ] Appropriate branch selected _(all PRs must first be merged to `dev` before they can be merged to `master`)_
+- [ ] Any modified or new methods or classes have helpful JavaDoc and code is thoroughly commented
+- [ ] The description lists all applicable issues this PR seeks to resolve
+- [ ] The description lists any configuration setting(s) that differ from the default settings
+- [ ] All tests and CI builds passing
+- [ ] The description lists all relevant PRs included in this release _(remove this if not merging to master)_
+- [ ] e2e tests are all passing _(remove this if not merging to master)_
+- [ ] Code coverage improves or is at 100% _(remove this if not merging to master)_
+
+### Description
+
+Please explain the changes you made here and, if not immediately obvious from the code, how they resolve any referenced issues. Be sure to include all issues being resolved and any special configuration settings that are needed for the software to run properly with these changes. If merging to master, please also list the PRs that are to be included.

.travis.yml (+47 −12)

@@ -1,44 +1,79 @@
+dist: trusty # jdk 8 not available on xenial
 language: java
-jdk:
-- oraclejdk8
+java:
+- oraclejdk8
 install: true
 sudo: false
 # Install mongoDB to perform persistence tests
-services: mongodb
+services:
+- mongodb
+- postgresql
+addons:
+  postgresql: 9.6
 cache:
   directories:
-  - "$HOME/.m2"
+  - $HOME/.m2
+  - $HOME/.cache/yarn
+# Install semantic-release
+before_script:
+- yarn global add @conveyal/maven-semantic-release semantic-release@15
+# Create dir for GTFS+ files (used during testing)
+- mkdir /tmp/gtfsplus
 before_install:
 #- sed -i.bak -e 's|https://nexus.codehaus.org/snapshots/|https://oss.sonatype.org/content/repositories/codehaus-snapshots/|g' ~/.m2/settings.xml
 # set region in AWS config for S3 setup
 - mkdir ~/.aws && printf '%s\n' '[default]' 'aws_access_key_id=foo' 'aws_secret_access_key=bar' 'region=us-east-1' > ~/.aws/config
 - cp configurations/default/server.yml.tmp configurations/default/server.yml
+# create database for tests
+- psql -U postgres -c 'CREATE DATABASE catalogue;'
 script:
 # package jar
 - mvn package
 after_success:
-# Upload coverage reports to codecov.io
-- bash <(curl -s https://codecov.io/bash)
-# notify slack channel of build status
+# this first codecov run will upload a report associated with the commit set through Travis CI environment variables
+- bash <(curl -s https://codecov.io/bash)
+# run maven-semantic-release to potentially create a new release of datatools-server. The flag --skip-maven-deploy is
+# used to avoid deploying to maven central. So essentially, this just creates a release with a changelog on github.
+#
+# If maven-semantic-release finishes successfully and the current branch is master, upload coverage reports for the
+# commits that maven-semantic-release generated. Since the above codecov run is associated with the commit that
+# initiated the Travis build, the report will not be associated with the commits that maven-semantic-release performed
+# (if it ended up creating a release and the two commits that were a part of that workflow). Therefore, if on master,
+# codecov needs to be run two more times to create codecov reports for the commits made by maven-semantic-release.
+# See https://github.com/conveyal/gtfs-lib/issues/193.
+#
+# The git commands get the commit hash of the HEAD commit and the commit just before HEAD.
+- |
+  semantic-release --prepare @conveyal/maven-semantic-release --publish @semantic-release/github,@conveyal/maven-semantic-release --verify-conditions @semantic-release/github,@conveyal/maven-semantic-release --verify-release @conveyal/maven-semantic-release --use-conveyal-workflow --dev-branch=dev --skip-maven-deploy
+  if [[ "$TRAVIS_BRANCH" = "master" ]]; then
+    bash <(curl -s https://codecov.io/bash) -C "$(git rev-parse HEAD)"
+    bash <(curl -s https://codecov.io/bash) -C "$(git rev-parse HEAD^)"
+  fi
 notifications:
+  # notify slack channel of build status
   slack: conveyal:WQxmWiu8PdmujwLw4ziW72Gc
 before_deploy:
 # get branch name of current branch for use in jar name: https://graysonkoonce.com/getting-the-current-branch-name-during-a-pull-request-in-travis-ci/
 - export BRANCH=$(if [ "$TRAVIS_PULL_REQUEST" == "false" ]; then echo $TRAVIS_BRANCH; else echo $TRAVIS_PULL_REQUEST_BRANCH; fi)
-# copy packaged jars over to deploy dir
+# Create directory that will contain artifacts to deploy to s3.
 - mkdir deploy
+# Display contents of target directory (for logging purposes only).
+- ls target/*.jar
+# Copy packaged jars over to deploy dir.
 - cp target/dt-*.jar deploy/
-- cp "target/dt-$(git describe --always).jar" "deploy/dt-latest-${BRANCH}.jar"
+# FIXME: Do not create a branch-specific jar for now. Having a jar that changes contents but keeps the same name
+# may cause confusion down the road and may be undesirable.
+# - cp "target/dt-$(git describe --always).jar" "deploy/dt-latest-${BRANCH}.jar"
 deploy:
   provider: s3
   skip_cleanup: true
-  access_key_id: AKIAJISY76KTZBNHS4SA
+  access_key_id: AKIAIWMAQP5YXWT7OZEA
   secret_access_key:
-    secure: a2PNYiv7kzgKxfSx6IhZxSCFBZTCjrbIAK/vmCB1KcpnlV4vTt/IL13i3u6XC8wAbUxhd6iJMtVRm4diIwmy0K7nnpp0h3cQDxYqWCmf1dHZWBJXkpurDpbfxW5G6IlL14i+EsTSCpmwalov+atOBDVyJWVGqfEYaj9c6Q1E0fiYNP3QwZQcsVuD1CRw91xzckfERwqYcz70p/hmTEPOgUwDHuyHsjFafJx+krY3mnBdRdDRLcnPavjcEtprjGkdiVbNETe3CHVNQrAVfqm187OoDA2tHTPjTFmlAdUedp4rYqLmF/WWbHZLzUkQb95FJkklx30vlwC0bIutP1TwIlr3ma5aCRFc58x3SzG07AeM+vbt/nh5A52cpdRjBnhctC2kL++QvwkJhwRy2xptl/WEd5AUagoN4ngnGzyDS4kk/taQFL0IAav5C2WH668kGyH17KNeWG/bCDd55oCvwNlppAYXH+WdbtylqiVb9Fllvs1wcIYWqqyX5zdYiyFEI8LyEQsNF/D5ekuAtLXcF25uwjNtHMjdAxQxHbAbBOeaaLwJd29os9GrKFI/2C0TVXZo2zaFLZyFaIsDHqAC+MXDBDtktimC9Uuozz7bXENCrOUBfsDEQXb46tkXLGaQNXeOhe3KwVKxlGDCsLb7iHIcdDyBm19hqUWhU3uA+dU=
+    secure: cDfIv+/+YimqsH8NvWQZy9YTqaplOwlIeEK+KEBCfsJ3DJK5sa6U4BMZCA4OMP1oTEaIxkd4Rcvj0OAYSFQVNQHtwc+1WeHobzu+MWajMNwmJYdjIvCqMFg2lgJdzCWv6vWcitNvrsYpuXxJlQOirY/4GjEh2gueHlilEdJEItBGYebQL0/5lg9704oeO9v+tIEVivtNc76K5DoxbAa1nW5wCYD7yMQ/cc9EQiMgR5PXNEVJS4hO7dfdDwk2ulGfpwTDrcSaR9JsHyoXj72kJHC9wocS9PLeeYzNAw6ctIymNIjotUf/QUeMlheBbLfTq6DKQ0ISLcD9YYOwviUMEGmnte+HCvTPTtxNbjBWPGa2HMkKsGjTptWu1RtqRJTLy19EN1WG5znO9M+lNGBjLivxHZA/3w7jyfvEU3wvQlzo59ytNMwOEJ3zvSm6r3/QmOr5BU+UHsqy5vv2lOQ9Nv10Uag11zDP1YWCoD96jvjZJsUZtW80ZweHYpDMq0vKdZwZSlbrhgHzS7vlDW7llZPUntz0SfKCjtddbRdy6T4HgsmA8EsBATfisWpmFA6roQSnYwfEZ5ooJ8IMjfOm1qGphrP1Qv8kYkqdtOyTijYErqJ3YzldjeItqaWtyD5tmHm6Wmq6XIbw4bnSfGRx9di+cG5lDEPe1tfBPCf9O5M=
   # upload jars in deploy dir to bucket
   bucket: datatools-builds
   local-dir: deploy
   acl: public_read
   on:
-    repo: catalogueglobal/datatools-server
+    repo: ibi-group/datatools-server
     all_branches: true
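The `before_deploy` branch lookup in the diff above resolves the branch name differently for push builds and pull-request builds. A minimal standalone sketch, run here with hypothetical values for the Travis environment variables (a PR build from `dev` targeting `master`; `=` is used in place of bash's `==` for portability):

```shell
# Hypothetical Travis environment for a pull-request build.
TRAVIS_PULL_REQUEST="1234"        # "false" only when the build is not a PR
TRAVIS_BRANCH="master"            # for PR builds this is the *target* branch
TRAVIS_PULL_REQUEST_BRANCH="dev"  # for PR builds, the *source* branch

# Same logic as the export BRANCH=... line in .travis.yml.
BRANCH=$(if [ "$TRAVIS_PULL_REQUEST" = "false" ]; then echo "$TRAVIS_BRANCH"; else echo "$TRAVIS_PULL_REQUEST_BRANCH"; fi)
echo "$BRANCH"   # prints "dev"
```

For a push build (`TRAVIS_PULL_REQUEST=false`) the same expression yields `$TRAVIS_BRANCH` instead, which is why the jar name tracks the source branch of a PR rather than its target.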

configurations/default/env.yml.tmp (+4 −1)

@@ -1,10 +1,13 @@
+# This client ID refers to the UI client in Auth0.
 AUTH0_CLIENT_ID: your-auth0-client-id
 AUTH0_DOMAIN: your-auth0-domain
 # Note: One of AUTH0_SECRET or AUTH0_PUBLIC_KEY should be used depending on the signing algorithm set on the client.
 # It seems that newer Auth0 accounts (2017 and later) might default to RS256 (public key).
 AUTH0_SECRET: your-auth0-secret # uses HS256 signing algorithm
 # AUTH0_PUBLIC_KEY: /path/to/auth0.pem # uses RS256 signing algorithm
-AUTH0_TOKEN: your-auth0-token
+# This client/secret pair refers to a machine-to-machine Auth0 application used to access the Management API.
+AUTH0_API_CLIENT: your-api-client-id
+AUTH0_API_SECRET: your-api-secret-id
 DISABLE_AUTH: false
 OSM_VEX: http://localhost:1000
 SPARKPOST_KEY: your-sparkpost-key
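The diff above replaces the static `AUTH0_TOKEN` with an `AUTH0_API_CLIENT`/`AUTH0_API_SECRET` pair. As a hedged sketch (not code from this repo): a machine-to-machine client like this is typically exchanged for a Management API token via Auth0's client-credentials grant. The values below are the placeholders from `env.yml.tmp`, so the `curl` call is shown commented out rather than executed:

```shell
# Placeholders copied from env.yml.tmp -- substitute real values in practice.
AUTH0_DOMAIN="your-auth0-domain"
AUTH0_API_CLIENT="your-api-client-id"
AUTH0_API_SECRET="your-api-secret-id"

# Build the client-credentials token request body for Auth0's Management API.
TOKEN_REQUEST=$(printf '{"grant_type":"client_credentials","client_id":"%s","client_secret":"%s","audience":"https://%s/api/v2/"}' \
  "$AUTH0_API_CLIENT" "$AUTH0_API_SECRET" "$AUTH0_DOMAIN")

# With real credentials, this POST returns a JSON body containing access_token:
# curl -s -X POST "https://$AUTH0_DOMAIN/oauth/token" \
#   -H 'Content-Type: application/json' -d "$TOKEN_REQUEST"
echo "$TOKEN_REQUEST"
```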

configurations/default/server.yml.tmp (+6)

@@ -15,6 +15,9 @@ modules:
     enabled: false
   user_admin:
     enabled: true
+  # Enable GTFS+ module for testing purposes
+  gtfsplus:
+    enabled: true
   gtfsapi:
     enabled: true
     load_on_fetch: false
@@ -29,3 +32,6 @@ extensions:
     enabled: true
     api: http://api.transitfeeds.com/v1/getFeeds
     key: your-api-key
+  # Enable MTC for testing purposes
+  mtc:
+    enabled: true

jmeter/README.md (+8 −4)

@@ -31,8 +31,8 @@ The test plan can be ran straight from the command line. A helper script is pro
 | 1 | test plan mode | `batch`, `fetch`, `query` or `upload` | which test plan mode to use when running the jmeter script. (see notes below for more explanation of these test plan modes) |
 | 2 | number of threads | an integer greater than 0 | The number of simultaneous threads to run at a time. The threads will have staggered start times 1 second apart. |
 | 3 | number of loops | an integer greater than 0 | the number of loops to run. This is combined with the number of threads, so if the number of threads is 10 and the number of loops is 8, the total number of test plans to run will be 80. |
-| 4 | project name or batch csv file | string of the project name or string of file path to batch csv file | This argument is required if running the script with the `batch` test plan mode, otherwise, this argument is optional. The jmeter script will create new projects with a project name plus the current iteration number. The default name is "test project #". Also, if the s3 bucket argument is also provided, the output folder will be tarred up and with this name. |
-| 5 | s3 bucket | string of an s3 bucket | OPTIONAL. If provided, the script will tar up the output folder and attempt to upload to the specified s3 bucket. This assumes that aws credentials have been setup for use by the `aws` command line tool. |
+| 4 | project name or batch csv file | string of the project name or string of file path to batch csv file | This argument is required if running the script with the `batch` test plan mode; otherwise, this argument is optional.<br><br>If in `fetch` or `upload` mode, the jmeter script will create new projects with the provided project name (or "test project" if a name is not provided) plus the current iteration number. In `fetch` or `upload` mode, the feed url and upload file are not configurable. In `fetch` mode, the url `http://documents.atlantaregional.com/transitdata/gtfs_ASC.zip` will be used to fetch the feed to create the feed version. In `upload` mode, the file `fixtures/gtfs.zip` will be uploaded to create the feed version.<br><br>If in `query` mode, jmeter will try to find the project matching the provided name (as long as the project name is not "test project"), or a random project will be picked if this argument is not provided. |
+| 5 | s3 bucket | string of an s3 bucket | OPTIONAL. If provided, the script will tar up the output folder and attempt to upload it to the specified s3 bucket. This assumes that aws credentials have been set up for use by the `aws` command line tool. If not running in batch mode and a project name has been specified, the name of this file will be `{project name}.tar.gz`. Otherwise, the name will be `output.tar.gz`. |
 
 Examples:
 
@@ -48,7 +48,7 @@ _Run the test plan in query mode 80 total times in 10 threads each completing 8
 
 _Run in batch mode. Note that all feeds in the csv file will be processed in each loop. So in the following command, each feed in the batch.csv file would be processed 6 times. See the section below for documentation on the csv file and also see the fixtures folder for an example file._
 ```sh
-./run-tests.sh query 3 2 batch.csv my-s3-bucket
+./run-tests.sh batch 3 2 batch.csv my-s3-bucket
 ```
 
 ### Running the upload test on multiple gtfs files
@@ -124,6 +124,8 @@ This section is run under the `query` test plan mode. This script assumes that
 
 This section is run in all test plan modes.
 
+1. Fetch stops and a row count of stops
+1. Make sure the number of stops matches the row count of stops
 1. Fetch all routes
 1. Pick a random route
 1. Fetch all trips on selected route
@@ -133,11 +135,13 @@ This section is run in all test plan modes.
 1. Fetch embedded stop_times from trips from a random pattern
 1. Check that all stop_times have proper trip_id
 1. Check that all stop_times in trips on pattern have same stop sequence as pattern
+1. Make a GraphQL request that contains a nested query of routes, patterns and stops
+1. Make sure that each route is present in the route within the list of patterns
 
 ## Reporting
 
 If running this script in GUI mode, it is possible to see all results in real-time by viewing the various listeners at the end of the thread group.
 
 When running the test plan from the command line in non-gui mode, reports will be saved to the `output` folder. The outputs will contain a csv file of all requests made and an html report summarizing the results. If the test plan mode was `batch`, `fetch` or `upload`, then another csv file will be written that contains a list of the elapsed time for processing the creation of a new gtfs feed version.
 
-The csv files can be loaded into a jmeter GUI listener to view more details.
+The csv files can be loaded into a jmeter GUI to view more details.
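The tarball-naming rule added to argument 5's description in the jmeter README diff can be sketched as a small shell fragment. The variable names here are hypothetical (the actual `run-tests.sh` may name things differently); it only illustrates the stated rule, `{project name}.tar.gz` when not in batch mode and a project name was given, `output.tar.gz` otherwise:

```shell
# Hypothetical inputs: test plan mode and optional project name.
mode="fetch"
project_name="demo"

# Naming rule from the README: project-specific name only outside batch mode
# and only when a project name was actually supplied.
if [ "$mode" != "batch" ] && [ -n "$project_name" ]; then
  archive="${project_name}.tar.gz"
else
  archive="output.tar.gz"
fi
echo "$archive"   # prints "demo.tar.gz"
```

Running the same fragment with `mode="batch"` (or an empty `project_name`) falls through to the generic `output.tar.gz` name.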
