
Running Python longhaul on Horton


The details

  • Horton only runs on Python 3.7+.
  • Horton tests are async/await, but glue code lets us test callback-based clients by wrapping them in async wrappers (see the sketch after this list).
  • We can test Python 3.7+ by loading azure-iot-device directly in the pytest process (called python_inproc).
  • To test 2.7 through 3.6, we need to load azure-iot-device into a host process with a REST interface. This host most commonly runs inside a Docker container.
  • We can run python_inproc tests on Windows or Linux.
  • We can run REST-based tests on Linux.
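The callback-to-async "glue" mentioned above comes down to turning a callback-style call into something awaitable. Here is a minimal sketch of that idea; it is not Horton's actual glue code, and CallbackClient / AsyncWrapper / send_event are hypothetical stand-ins for a callback-based SDK client.

import asyncio

class CallbackClient:
    """Hypothetical callback-based client, standing in for a V1-style SDK client."""
    def send_event(self, payload, callback):
        # Pretend the operation completes immediately and reports success.
        callback(None)  # None means "no error"

class AsyncWrapper:
    """Wraps a callback-based client so async/await tests can drive it."""
    def __init__(self, client):
        self.client = client

    async def send_event(self, payload):
        loop = asyncio.get_running_loop()
        future = loop.create_future()

        def on_complete(error):
            # Marshal the callback result back onto the event loop.
            if error:
                loop.call_soon_threadsafe(future.set_exception, Exception(error))
            else:
                loop.call_soon_threadsafe(future.set_result, None)

        self.client.send_event(payload, on_complete)
        await future

async def main():
    wrapper = AsyncWrapper(CallbackClient())
    await wrapper.send_event("telemetry payload")

asyncio.run(main())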

Step 1 (Linux only): install/build Python 3.7

If you have a favorite method for installing Python 3.7 (conda, etc.), use that. If not, you can run source scripts/setup/build_python.sh. This installs the GNU toolchain, builds Python 3.7 (using pyenv), and creates a virtual environment (using virtualenv) to run it in.

Step 2: Activate the Horton environment

Horton uses a Python virtual environment (created with virtualenv). If you don't see (horton) in your prompt string, your environment is not set up correctly.

On Linux, run source bin/activate_horton. This will create the Horton virtual environment, install libraries, and update the path. If successful, you should be greeted with Horton environment activated and your prompt should contain (horton).

(Python-3.7.1) bertk@bertk-hp:~/repos/e2e-fx$ source bin/activate_horton
~/repos/e2e-fx ~/repos/e2e-fx ~/repos/e2e-fx ~
Initializing horton environment
checking for python 3.7+
<<-- snip -->>
installing required python libraries
Horton environment activated
(horton) bertk@bertk-hp:~/repos/e2e-fx$

On Windows, run bin\activate_horton.cmd. This uses nuget to download Python 3.8.1 and virtualenv to make a virtual environment to run Horton in.

(You might need to download nuget.exe from https://www.nuget.org/downloads in order to make this work.)

Again, you should be greeted with Horton environment activated and your prompt should contain (horton).

C:\Users\bertk\repos\e2e-fx>bin\activate_horton
Initializing horton environment
Installing Python 3.8.1
Feeds used:
  https://api.nuget.org/v3/index.json
<<--snip-->>
Activating virtual environment
Installing horton libraries
Horton environment activated

(horton) C:\Users\bertk\repos\e2e-fx>

Every time you use Horton, you have to run activate_horton to enter the environment. It will be faster the second time.

Step 3: Install docker (Linux only)

If you want to run tests using Docker containers (for everything except Python 3.7+), you need to install the Docker engine and a local container repository. You also need to set these environment variables:

export IOTHUB_E2E_REPO_USER="REDACTED_GET_FROM_KEYVAULT"
export IOTHUB_E2E_REPO_PASSWORD="REDACTED_GET_FROM_KEYVAULT"
export IOTHUB_E2E_REPO_ADDRESS="REDACTED_GET_FROM_KEYVAULT"

and you need to run docker login to authenticate with the container repository. There's a script for this in scripts/setup/docker_login.sh.

(horton) bertk@bertk-hp:~/repos/e2e-fx/scripts/setup$ ./docker-login.sh
WARNING! Using --password via the CLI is insecure. Use --password-stdin.
WARNING! Your password will be stored unencrypted in /home/bertk/.docker/config.json.
Configure a credential helper to remove this warning. See
https://docs.docker.com/engine/reference/commandline/login/#credentials-store

Login Succeeded
(horton) bertk@bertk-hp:~/repos/e2e-fx/scripts/setup$
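The login script is essentially a docker login driven by those environment variables. A minimal Python sketch of the same idea (not the script itself), using --password-stdin as the warning above recommends:

import os
import subprocess

# Read the repository credentials from the environment (set in the previous step).
user = os.environ["IOTHUB_E2E_REPO_USER"]
password = os.environ["IOTHUB_E2E_REPO_PASSWORD"]
address = os.environ["IOTHUB_E2E_REPO_ADDRESS"]

# Pipe the password over stdin instead of passing it on the command line.
subprocess.run(
    ["docker", "login", "--username", user, "--password-stdin", address],
    input=password,
    text=True,
    check=True,
)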

Step 4: Install azure-iot-device, etc. (Python only)

If you're testing Python without Docker, you need to install the azure-iot-device libraries (and dependencies) into your Horton environment. Go into the root of your Python repo and run python env_setup.py --no_dev. (--no_dev is optional; it just makes installation a little faster and removes some warnings.)

(horton) bertk@bertk-hp:~/repos/python$ python env_setup.py --no_dev
Executing: install --upgrade pip
<<--snip-->>
Successfully installed azure-iothub-provisioningserviceclient-1.2.0 coverage-5.1 flake8-3.8.3 mccabe-0.6.1 mock-4.0.2 pycodestyle-2.6.0 pyflakes-2.2.0 pytest-5.4.3 pytest-asyncio-0.14.0 pytest-cov-2.10.0 pytest-mock-1.10.4

(horton) bertk@bertk-hp:~/repos/python$

Step 5: deploy

This is where you create the IoT Hub devices, etc., that you're going to test against. First, you need to set IOTHUB_E2E_CONNECTION_STRING to point to your hub. (Use export on Linux and add double quotes around the value. You know the drill by now.)

(horton) C:\Users\bertk\repos\e2e-fx>set IOTHUB_E2E_CONNECTION_STRING=HostName=REDACTED;SharedAccessKeyName=iothubowner;SharedAccessKey=REDACTED

To run python_inproc, run horton deploy iothub python_inproc

(horton) C:\Users\bertk\repos>horton deploy iothub python_inproc
clearing test_module object
clearing friend_module object
clearing iotedge object
clearing test_device object
clearing leaf_device object
Creating new device on hub bertk-edge.azure-devices.net
creating device machine_name_HLE_572_test_device
creating module machine_name_HLE_572_test_device/testMod

(horton) C:\Users\bertk\repos>

Or, to run in a container, run horton deploy iothub lkg --language=node (or pythonv2, c, csharp, or java). The lkg (last known good) parameter tells the script to download an image containing the last build that passed the nightly tests.

(horton) bertk@bertk-hp:~/repos/e2e-fx$ horton deploy iothub lkg --language=node
Using LKG image: node-e2e-v3:lkg
<<--snip-->>
running [sudo -n docker run -d -p 8099:8080 -p 8199:22 -p 8140:8040 --name testMod --restart=on-failure:10 --cap-add NET_ADMIN --cap-add NET_RAW REDACTED/node-e2e-v3:lkg]
(horton) bertk@bertk-hp:~/repos/e2e-fx$

If using docker, you should see the container running:


(horton) bertk@bertk-hp:~/repos/e2e-fx$ docker ps
CONTAINER ID        IMAGE                                  COMMAND                  CREATED             STATUS              PORTS                                                                            NAMES
f94e37a44749        iotsdke2e.azurecr.io/node-e2e-v3:lkg   "/usr/local/bin/node…"   2 minutes ago       Up 2 minutes        9229/tcp, 0.0.0.0:8199->22/tcp, 0.0.0.0:8140->8040/tcp, 0.0.0.0:8099->8080/tcp   testMod
663b947a5994        registry:2                             "/entrypoint.sh /etc…"   3 weeks ago         Up 3 weeks          0.0.0.0:5000->5000/tcp                                                           registry
(horton) bertk@bertk-hp:~/repos/e2e-fx$

(There are many more ways to run horton deploy.)

Finally, you need to update credentials so the test runner can find them. Run horton get_credentials. You need to do this no matter how or where you run Horton. Eventually this will be automatic.

(horton) bertk@bertk-hp:~/repos/e2e-fx$ horton get_credentials
Added connection string for test_device device bertk-hp_CGO_248_test_device
Added connection string for test_module module bertk-hp_CGO_248_test_device,testMod
(horton) bertk@bertk-hp:~/repos/e2e-fx$

Step 6: change longhaul parameters

(Details from this point forward are very likely to change in the near future)

Edit test_runner\test_longhaul.py and look for the test config at the top of the file. Currently it looks like this:

desired_node_config = {
    "test_config": {
        "d2c": {
            "enabled": True,
            "interval": 1,
            "ops_per_interval": 10,
            "slow_send_threshold": "0:00:02",
            "slow_send_and_receive_threshold": "0:00:05",
        },
        "scenario": "test_longhaul_d2c_simple",
        "total_duration": "0:05:00",
    }
}

(Ignore the word "node" in the name. It's meaningless.)

This says:

  • We want to run D2C 10 times every 1 second.
  • If the send (PUBLISH->PUBACK) takes more than 2 seconds, it's considered a "slow send" (see the sketch after this list).
  • If it takes more than 5 seconds to do PUBLISH->PUBACK and see the results in EventHub, that's considered a "slow send and receive".
  • The test is set to run for 5 minutes.
  • (The scenario name is meaningless, for now at least.)
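The threshold strings use the same H:MM:SS format that Python's timedelta prints. A minimal sketch of how a measured latency could be classified against them; this is illustrative, not the test runner's actual code, and parse_threshold / classify are hypothetical helpers:

from datetime import timedelta

def parse_threshold(value):
    """Parse an 'H:MM:SS' threshold string (e.g. '0:00:02') into a timedelta."""
    hours, minutes, seconds = (int(part) for part in value.split(":"))
    return timedelta(hours=hours, minutes=minutes, seconds=seconds)

def classify(send_latency, send_and_receive_latency, d2c_config):
    """Return labels describing how slow a single D2C operation was."""
    labels = []
    if send_latency > parse_threshold(d2c_config["slow_send_threshold"]):
        labels.append("slow_send")
    if send_and_receive_latency > parse_threshold(
        d2c_config["slow_send_and_receive_threshold"]
    ):
        labels.append("slow_send_and_receive")
    return labels

d2c_config = {
    "slow_send_threshold": "0:00:02",
    "slow_send_and_receive_threshold": "0:00:05",
}
print(classify(timedelta(seconds=5.4), timedelta(seconds=14.5), d2c_config))
# ['slow_send', 'slow_send_and_receive']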

To run this test, run pytest --scenario=iothub_device-stress -k test_longhaul -s -x

As this runs, you should see reported properties update every 5 seconds, and you should see every telemetry message carrying the "progress" subset of this structure. The structure is defined in longhaul_config.py.
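For reference, updating reported properties from the device side looks roughly like this with the azure-iot-device async client. This is a hedged sketch of the pattern, not the longhaul test's actual code; the connection string environment variable name and the progress payload are placeholders.

import asyncio
import os
from azure.iot.device.aio import IoTHubDeviceClient

async def report_progress():
    # Placeholder: connection string for the test device created by `horton deploy`.
    conn_str = os.environ["IOTHUB_DEVICE_CONNECTION_STRING"]
    client = IoTHubDeviceClient.create_from_connection_string(conn_str)
    await client.connect()
    try:
        # Patch a "progress" reported property every 5 seconds, like the longhaul run does.
        for i in range(3):
            await client.patch_twin_reported_properties(
                {"progress": {"status": "running", "ops_completed": i * 10}}
            )
            await asyncio.sleep(5)
    finally:
        await client.disconnect()

asyncio.run(report_progress())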

For a recent run that went for over 4 days and had lots of problems, the reported properties looked like this:

"reported": {
      "progress": {
        "status": "running",
        "total_duration": "4 days, 3:59:59",
        "active_objects": 147,
        "elapsed_time": "2 days, 18:01:55.551033",
        "fiftieth_percentile_send_and_receive_latency": 14.482652,
        "fiftieth_percentile_send_latency": 5.472373,
        "mean_send_and_receive_latency": 15.461572426299414,
        "mean_send_latency": 5.721816537557754,
        "ops_completed": 218560,
        "ops_slow_send": 205677,
        "ops_slow_send_and_receive": 182025,
        "start_time": "2020-06-25 15:21:54.781014",
        "ops_waiting_to_complete": 46139,
        "ops_waiting_to_send": 1
      },
      "test_config": {
        "d2c": {
          "enabled": true,
          "interval": 1,
          "ops_per_interval": 10,
          "slow_send_and_receive_threshold": "0:00:05",
          "slow_send_threshold": "0:00:02"
        },
        "scenario": "test_longhaul_d2c_simple",
        "total_duration": "4 days, 3:59:59"
      },
      "test_stats": {
        "d2c": {
          "fiftieth_percentile_send_and_receive_latency": 14.482652,
          "fiftieth_percentile_send_latency": 5.472373,
          "mean_send_and_receive_latency": 15.461572426299414,
          "mean_send_latency": 5.721816537557754,
          "ops_completed": 218560,
          "ops_slow_send": 205677,
          "ops_slow_send_and_receive": 182025,
          "ops_waiting_to_complete": 46139,
          "ops_waiting_to_send": 1
        }
      },
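The latency figures in this snapshot (mean and fiftieth-percentile send latency) are just aggregate statistics over per-operation latency samples. A minimal sketch of how they could be computed, assuming the samples are collected as floats in seconds; this is illustrative, not the framework's implementation:

import statistics

# Hypothetical per-operation send latencies, in seconds.
send_latencies = [5.1, 5.5, 4.9, 6.2, 5.4, 7.0, 5.3]

stats = {
    "mean_send_latency": statistics.mean(send_latencies),
    # The median is the fiftieth percentile.
    "fiftieth_percentile_send_latency": statistics.median(send_latencies),
    "ops_completed": len(send_latencies),
}
print(stats)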