Extending the WFS-Probe request to handle the count parameter. #469

Open · wants to merge 5 commits into base: master

Conversation

@nollpa nollpa commented Apr 10, 2024

max_count added to the PARAM_DEFS with a default of 1000.

@justb4 justb4 added this to the Version 0.9.1 milestone Apr 10, 2024
@justb4 justb4 left a comment

@nollpa Thanks for this quick action! Unit tests are run automatically. We have one failure: a test against a WFS from the Dutch national GDI: https://service.pdok.nl/lv/bag/wfs/v2_0?service=WFS&version=1.1.0&request=GetCapabilities (buildings and addresses).

As I was already wondering while looking at the code and the WFS request template: before WFS 2.0.0 the parameter was called "maxfeatures", for 2.0.0 and higher it is "count". What is accepted depends a bit on the WFS implementation (GeoServer, MapServer, deegree); some WFS servers, also through configuration, can be more "forgiving".

The Probe request template uses WFS 1.1.0, so that requires maxfeatures, but ideally it would support multiple WFS versions, like the GetCapabilities Probe does.
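For illustration, a minimal sketch of the two KVP variants; the endpoint and type name are placeholders, not the Probe's actual request template:

    # Sketch only: shows the version-dependent name of the feature limit
    # in WFS KVP requests. Endpoint and type name are hypothetical.
    import requests  # assumed to be available

    BASE = 'https://example.org/wfs'      # hypothetical WFS endpoint
    TYPE_NAME = 'ns:SomeFeatureType'      # hypothetical feature type

    # WFS 1.1.0: the feature limit is called maxFeatures
    params_110 = {
        'service': 'WFS', 'version': '1.1.0', 'request': 'GetFeature',
        'typeName': TYPE_NAME, 'maxFeatures': 5,
    }

    # WFS 2.0.0 and higher: the same limit is called count
    params_200 = {
        'service': 'WFS', 'version': '2.0.0', 'request': 'GetFeature',
        'typeNames': TYPE_NAME, 'count': 5,
    }

    for params in (params_110, params_200):
        r = requests.get(BASE, params=params)
        print(params['version'], r.status_code)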

The quickest way now is to use maxfeatures, see if that works, and then extend with new PRs. I also see room for a plain GetFeature Probe, even as a GET, with max_count. Filling in a BBOX is always tricky; the default is the entire bbox of the data type.

Another suggestion: the default of 1000 is a bit high; some features, for instance GML app-schema features, may be very big, and a high count does not add to validation. My suggestion is to use 5.

nollpa commented Apr 10, 2024

@justb4
I've pushed two new commits to my branch (maxfeatures instead of count, and default value 5).
If this works, I will look into a possible implementation supporting different WFS versions, as you mentioned.
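Roughly, the change looks like this (an illustrative reconstruction, not the literal diff of the commits; the template fragment differs from the real Probe template):

    # Illustration only, not the actual commits: the PARAM_DEFS default
    # drops from '1000' to '5', and the template placeholder switches
    # from COUNT to the WFS 1.1.0 name MAXFEATURES.
    PARAM_DEFS = {
        'max_count': {
            'type': 'string',
            'description': 'Maximum amount of features to select',
            'default': '5',
            'required': True,
            'range': None
        },
    }

    # Illustrative template fragment, not GeoHealthCheck's real template
    TEMPLATE_FRAGMENT = '&MAXFEATURES={max_count}'
    print(TEMPLATE_FRAGMENT.format(
        max_count=PARAM_DEFS['max_count']['default']))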

justb4 commented Apr 10, 2024

The unit test still fails, but I think/hope for a different reason: the max_count parameter has required: True.
Maybe that makes the test fail. Or the test data requires that parameter; see this data, which is filled in during the unit test:

https://github.com/geopython/GeoHealthCheck/blob/master/tests/data/resources.json#L196

Maybe adding max_count to the parameters will make the test succeed,
but also in light of upgrades to existing installs with WFS GetFeature Probes, it would be better to be forward compatible. It would be very hard to solve with an Alembic/DB upgrade, as Probe parameters are stored in a free-text column...

Maybe first see what happens if required: False is set, and maybe add an extra test (I can help, it is just a change in the JSON above) with/without max_count.

@nollpa nollpa requested a review from justb4 April 11, 2024 06:35
nollpa commented Apr 11, 2024

I've run the tests in my local environment. After my newest changes, the tests regarding the WFS resources are fine, but I get an error on the LDPROXY FEATURES test. I get that error on the master branch too, so maybe it is a local problem? Does it pass in your environment?

The max_count parameter must be set in fixtures.json. Without it, an error is raised for "missing key 'max_count'", no matter whether the parameter is set to required or not. This is caused by the mismatching keys during self.REQUEST_TEMPLATE.format(**request_parms).
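A minimal reproduction of that failure mode (the template fragment is illustrative, not the real Probe template); str.format() raises KeyError for any placeholder without a matching key, regardless of the required flag in PARAM_DEFS:

    # Illustrative template fragment only
    REQUEST_TEMPLATE = '...&MAXFEATURES={max_count}'

    request_parms = {}  # max_count not supplied by the fixture
    try:
        REQUEST_TEMPLATE.format(**request_parms)
    except KeyError as err:
        print('missing key', err)  # -> missing key 'max_count'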

What is resources.json used for? I can't find where it is used.

justb4 commented Apr 11, 2024

All unit tests (at least in the GitHub CI) now pass!

> I've run the tests in my local environment. After my newest changes, the tests regarding the WFS resources are fine, but I get an error on the LDPROXY FEATURES test. I get that error on the master branch too, so maybe it is a local problem? Does it pass in your environment?

I would not worry about this. It passes in the GitHub CI. The general problem is relying on external services for tests; on the other hand, we like to test with as many (in this case OGC API Features) implementations as possible.

> The max_count parameter must be set in fixtures.json. Without it, an error is raised for "missing key 'max_count'", no matter whether the parameter is set to required or not. This is caused by the mismatching keys during self.REQUEST_TEMPLATE.format(**request_parms).

Hmm, I see now that required: False is only used in a few Probes, like WMS for STYLES={style}, which is allowed to be empty. What I would expect with required: False and default: 5 is that the default is applied.
What I suggest is to also add a value: field; this is a bit of an experiment with the default. That value, which can be overridden in the UI Probe form or from values in the DB (or the .json test file), will be filled in by self.REQUEST_TEMPLATE.format(**request_parms), I expect. We use value for fixed-value fields like service={service} to have reusable templates. So:

        'max_count': {
            'type': 'string',
            'description': 'Maximum amount of features to select',
            'default': '5',
            'value': '5',
            'required': False,
            'range': None
        },

Alternatively, a bit more work, is to adapt expand_params() and use the default to add/fill in a value. But first try the value: attribute.
Then I would also add tests with and without max_count.
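A rough sketch of that alternative, falling back to a parameter's default when no value is stored; the function name and data shapes are assumptions for illustration, not the actual expand_params():

    # Sketch only: fill missing parameters from their PARAM_DEFS default.
    # Name and shapes are assumptions, not GeoHealthCheck's real API.
    def fill_defaults(param_defs, request_parms):
        """Return a copy of request_parms with missing params defaulted."""
        filled = dict(request_parms)
        for name, definition in param_defs.items():
            if filled.get(name) in (None, '') and 'default' in definition:
                filled[name] = definition['default']
        return filled

    # Example: max_count absent in the stored parameters
    defs = {'max_count': {'default': '5', 'required': False}}
    print(fill_defaults(defs, {}))  # -> {'max_count': '5'}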

> What is resources.json used for? I can't find where it is used.

fixtures.json and resources.json are in effect 'fixtures' to populate a DB for subsequent test execution. Maybe more white-box than unit tests, but this provides a very quick way to maintain tests without writing explicit Python code.

But I now see that only fixtures.json is used and resources.json indeed nowhere; it is roughly equivalent, maybe a dangling renaming issue. Ignore resources.json, we may delete it.
test_resources.py's testRunResources() will basically run all tests in the DB, then store and check the results. Very compact.

Focus only on fixtures.json for now. I suggest adding two new tests in fixtures.json without max_count.
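Schematically, the two extra cases could differ only in their parameters, e.g. (shown as Python dicts, key names hypothetical; the real schema is in tests/data/fixtures.json):

    # Schematic only: parameter sets for two hypothetical WFS GetFeature
    # Probe fixtures, one with and one without max_count. The surrounding
    # fixture structure (probe class, resource, checks) is omitted.
    parameters_with_max_count = {
        'type_name': 'ns:SomeFeatureType',   # hypothetical values
        'type_ns_prefix': 'ns',
        'max_count': '5',
    }

    parameters_without_max_count = {
        'type_name': 'ns:SomeFeatureType',
        'type_ns_prefix': 'ns',
        # max_count omitted: should come from default/value in PARAM_DEFS
    }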

Thanks for your patience!

@tomkralidis commented

Hi all: any update on this issue?
