Scripts for use within Modern Tools For Housing
Provides utilities for connecting to and scripting against:
- DynamoDB
- RDS (with ORM bindings)
- Elasticsearch
- Ensure you have a recent version of Python 3 (e.g. Python 3.11).
- Ensure you have the pip package manager for Python 3. Run `python3 -m pip --version` to see if you have it installed; if not, run `python3 -m ensurepip` to install it.
- Make a venv (local package directory) with `python3 -m venv venv`.
- Activate the venv:
  - Linux / MacOS: run `source venv/bin/activate`
  - Windows: run `./venv/Scripts/activate.bat` or `./venv/Scripts/activate.ps1`, depending on what's available.
- Optionally verify the venv is active:
  - Linux / MacOS: run `echo $VIRTUAL_ENV` and check it points to the venv.
  - Windows: run `echo %VIRTUAL_ENV%` and check it points to the venv.
- Run `python3 -m pip install -r requirements.txt` in the root directory of the repository to install all requirements into the venv.
This will simplify setup and install various useful tools.
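The steps above can be sketched as a single shell session (Linux / MacOS; the final step assumes you are in the repository root, where `requirements.txt` lives):

```shell
# Create a venv and activate it.
python3 -m venv venv
. venv/bin/activate
echo "$VIRTUAL_ENV"   # should print the path to ./venv

# Install the project requirements only if the file is present
# (i.e. you are in the repository root).
if [ -f requirements.txt ]; then
    python3 -m pip install -r requirements.txt
fi
```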
On Windows, you must clone and use this repository in WSL for acceptable performance, and you must mount your AWS directory from your WSL filesystem, not your Windows filesystem. If you experience issues, consider setting up without devcontainers; this is a limitation of Docker on Windows with no clear fix.
- Ensure you have Docker Desktop or Docker Engine installed with the Docker daemon active
- Ensure you have the Dev Containers extension in VSCode or equivalent in your favourite IDE
- Select the "Reopen in Container" prompt (or equivalent) when it appears, or run it from the Command Palette (Ctrl+Shift+P)
Note: The first setup may take several minutes to build the Docker container, but subsequent builds take only a few seconds. The usual rules of Docker apply: the container is built in layers from `.devcontainer/Dockerfile`, and layers are cached where possible.
Note: Your `~/.aws` directory will be mounted into the Docker container if you're using a devcontainer, so you can set up your AWS credentials on your host machine and access them in the container.
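For reference, the mount is typically wired up in the devcontainer configuration roughly like this (a sketch only; the bind source/target paths here are assumptions, so check `.devcontainer/devcontainer.json` in this repository for the actual values):

```json
{
  "build": { "dockerfile": "Dockerfile" },
  "mounts": [
    "source=${localEnv:HOME}/.aws,target=/root/.aws,type=bind"
  ]
}
```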
Install the AWS CLI - the scripts use AWS CLI profile based authentication to make calls to AWS.
You'll need a recent version of the AWS CLI to connect to databases via port forwarding, as this is a relatively new feature.
You can follow these steps multiple times to set up different profiles for different AWS accounts:
- Go to the Google SSO AWS start page and select "Command line or programmatic access" for the account needed.
- Note the steps under AWS IAM Identity Center credentials (Recommended) and use those details below.
- Run `aws configure sso` and follow the prompts to set up your AWS credentials. During this process, take care to set the correct values for `sso_region` and `cli default client region`, as these can be different. For the `cli profile name` value, refer to the Stage enum in `enums/enums.py`, as it must exactly match one of its members. Add to the enum as needed for different profiles.
- To refresh your credentials when they expire, run `aws sso login --profile {profile_name}`
You will need to refresh your credentials as described above when commands fail, usually with a message like `Error when retrieving token from sso: Token has expired and refresh failed`
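Because the `cli profile name` must exactly match a member of the Stage enum, a quick sanity check in Python can catch typos early. The member names below are hypothetical illustrations; the real ones live in `enums/enums.py`:

```python
from enum import Enum


# Hypothetical sketch -- the actual members are defined in enums/enums.py.
class Stage(Enum):
    DEVELOPMENT = "development"
    STAGING = "staging"
    PRODUCTION = "production"


def is_known_profile(profile_name: str) -> bool:
    """Return True only if the profile name matches a Stage value exactly."""
    return profile_name in {stage.value for stage in Stage}
```

If `is_known_profile` returns False for the name you gave `aws configure sso`, add a matching member to the enum before running the scripts.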
You should not need further dependencies to run any scripts.
Open a script and hit the play button in VSCode or your IDE to debug it; in VSCode this uses the debug configuration in `.vscode/launch.json`.
Python convention is to create a `main` function in a file and call it inside an `if __name__ == "__main__":` block at the end of the file.
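In practice the convention looks like this (a minimal skeleton, not taken from any specific script in the repository):

```python
def main() -> None:
    # Script logic goes here; keeping it in main() lets the debugger run
    # the file directly while other modules can import it without side
    # effects.
    print("running script")


if __name__ == "__main__":
    main()
```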