Custom LBH solution to be used as a Lambda authorizer for verifying API access tokens.
- .NET Core as a web framework.
- NUnit as a test framework.
Notes
- To use this auth solution, the consumers of your API will need to first generate an access token
- More information on token generation can be found in the Token Administration API repository
- Each consumer should provide their access token in an `Authorization` header when making a request (see the sketch below)
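As an illustration, a consumer might attach the token like this. This is a minimal sketch; the endpoint URL and the use of a `Bearer` scheme prefix are assumptions, so supply the token in whatever form your API expects:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

public static class ApiConsumerExample
{
    public static async Task CallProtectedEndpointAsync(string accessToken)
    {
        using var client = new HttpClient();

        // Provide the access token in the Authorization header.
        // The "Bearer" scheme prefix is an assumption, not confirmed by this repo.
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", accessToken);

        // Hypothetical endpoint URL, for illustration only.
        var response = await client.GetAsync("https://example-api.hackney.gov.uk/api/v1/items");
        Console.WriteLine($"Status: {(int)response.StatusCode}");
    }
}
```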
Update the `serverless.yml` file of your API to use a Lambda authorizer for authentication by including the following authorizer reference:

```yaml
functions:
  nameOfApi:
    ...
    events:
      - http:
          path: /{proxy+}
          method: ANY
          authorizer:
            arn: ${ssm:/platform-apis-lambda-authorizer-arn}
            type: request
          private: true
```
The auth solution retrieves information about the API based on the user request. This includes:
- API name
- API endpoint name
- HTTP method type
- Environment (API stage)
It then retrieves the token record for the consumer and ensures that the token is not revoked, expired or invalid.
If the token is valid, it compares the request data to the token record to ensure that the given consumer should have access to the API endpoint they are trying to consume.
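A rough sketch of that flow is below; every type and member name in it (`ApiRequestDetails`, `TokenRecord`, `ITokenGateway`) is hypothetical, standing in for the solution's actual classes:

```csharp
using System;

// Hypothetical shapes for the request data and the stored token record.
public record ApiRequestDetails(string Token, string ApiName, string Endpoint, string HttpMethod, string Environment);
public record TokenRecord(string ApiName, string Endpoint, string HttpMethod, string Environment, bool Revoked, DateTime ExpiresAt);

public interface ITokenGateway
{
    TokenRecord GetTokenRecord(string token);
}

public class AuthorizerSketch
{
    public bool IsAuthorized(ApiRequestDetails request, ITokenGateway gateway)
    {
        // Retrieve the token record for the consumer.
        var token = gateway.GetTokenRecord(request.Token);

        // Reject unknown, revoked or expired tokens.
        if (token == null || token.Revoked || token.ExpiresAt < DateTime.UtcNow)
            return false;

        // Compare the request data to the token record: the consumer may only
        // call the API, endpoint, method and environment their token grants.
        return token.ApiName == request.ApiName
            && token.Endpoint == request.Endpoint
            && token.HttpMethod == request.HttpMethod
            && token.Environment == request.Environment;
    }
}
```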
- Install Docker.
- Install AWS CLI.
- Clone this repository.
- Rename the initial template.
- Open it in your IDE.
The renaming of `base-api` into `SomethingElseApi` can be done by running a Renamer PowerShell script. To do so:
- Open PowerShell and navigate to this directory's root.
- Run the script using the following command:

```powershell
.\Renamer.ps1 -apiName My_Api
```
If your script execution policy prevents you from running the script, you can temporarily bypass that with:

```powershell
powershell -noprofile -ExecutionPolicy Bypass -file .\Renamer.ps1 -apiName My_Api
```
Alternatively, you can permanently change your execution policy before running the script (this disables a security check, so be cautious):

```powershell
Set-ExecutionPolicy Unrestricted
```
After the renaming is done, the script will offer to delete itself, as it is no longer needed. It's your choice.
To serve the application, run it using your IDE of choice; we use Visual Studio CE and JetBrains Rider on Mac.
The application can also be served locally using Docker. Build and serve the application; it will be available on port 3000.

```sh
$ make build && make serve
```
We use a pull request workflow, where changes are made on a branch and approved by one or more other maintainers before the developer can merge into the `master` branch.
Then we have an automated six-step deployment process, which runs in CircleCI.
- Automated tests (NUnit) are run to ensure the release is of good quality.
- The application is deployed to development automatically, where we check our latest changes work well.
- We manually confirm a staging deployment in the CircleCI workflow once we're happy with our changes in development.
- The application is deployed to staging.
- We manually confirm a production deployment in the CircleCI workflow once we're happy with our changes in staging.
- The application is deployed to production.
Our staging and production environments are hosted by AWS. We deploy to production for each feature or config change merged into the `master` branch.
Using FxCop Analysers
FxCop runs code analysis when the Solution is built.
Both the API and Test projects have been set up to treat all warnings from the code analysis as errors and therefore, fail the build.
However, we can select which errors to suppress by setting the severity of the responsible rule to `none`, e.g. `dotnet_analyzer_diagnostic.<Category-or-RuleId>.severity = none`, within the `.editorconfig` file.
Documentation on how to do this can be found here.
To run the tests:

```sh
$ make test
```
To run database tests locally (e.g. via Visual Studio), the `CONNECTION_STRING` environment variable will need to be populated with:

```
Host=localhost;Database=entitycore;Username=postgres;Password=mypassword
```
Note: to run tests via Docker, the Host name needs to be the name of the stub database docker-compose service.
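For example, a test fixture might read that variable when configuring the database context. This is a sketch assuming EF Core with the Npgsql provider; `DatabaseContext` is a placeholder name:

```csharp
using System;
using Microsoft.EntityFrameworkCore;

// Placeholder context; the real project defines its own DbContext.
public class DatabaseContext : DbContext
{
    public DatabaseContext(DbContextOptions<DatabaseContext> options) : base(options) { }
}

public static class TestDatabase
{
    public static DatabaseContext Build()
    {
        // Read the connection string supplied via the environment.
        var connectionString = Environment.GetEnvironmentVariable("CONNECTION_STRING");

        var options = new DbContextOptionsBuilder<DatabaseContext>()
            .UseNpgsql(connectionString)
            .Options;

        return new DatabaseContext(options);
    }
}
```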
- Use NUnit, FluentAssertions and Moq (see the example after this list)
- Always follow a TDD approach
- Tests should be independent of each other
- Gateway tests should interact with a real test instance of the database
- Test coverage should never go down
- All use cases should be covered by E2E tests
- Optimise when test run speed starts to hinder development
- Unit tests and E2E tests should run in CI
- Test database schemas should match up with production database schema
- Have integration tests which test from the PostgreSQL database to API Gateway
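For illustration, a unit test in this style might look like the following sketch; `IExampleGateway` and `ExampleUseCase` are hypothetical names, not classes from this solution:

```csharp
using FluentAssertions;
using Moq;
using NUnit.Framework;

public interface IExampleGateway
{
    string GetApiName(string token);
}

public class ExampleUseCase
{
    private readonly IExampleGateway _gateway;
    public ExampleUseCase(IExampleGateway gateway) => _gateway = gateway;
    public string Execute(string token) => _gateway.GetApiName(token);
}

[TestFixture]
public class ExampleUseCaseTests
{
    [Test]
    public void ExecuteReturnsTheApiNameFromTheGateway()
    {
        // Mock the gateway so the test stays independent of the database.
        var gateway = new Mock<IExampleGateway>();
        gateway.Setup(g => g.GetApiName("some-token")).Returns("my-api");

        var result = new ExampleUseCase(gateway.Object).Execute("some-token");

        result.Should().Be("my-api");
        gateway.Verify(g => g.GetApiName("some-token"), Times.Once);
    }
}
```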
- Record failure logs
- Automated
- Reliable
- As close to real time as possible
- Observable monitoring in place
- Should not affect any existing databases
- Selwyn Preston, Lead Developer at London Borough of Hackney ([email protected])
- Mirela Georgieva, Lead Developer at London Borough of Hackney ([email protected])
- Matt Keyworth, Lead Developer at London Borough of Hackney ([email protected])
- Rashmi Shetty, Product Owner at London Borough of Hackney ([email protected])