Running integration tests with mock hardware #1226
Conversation
Codecov Report ✅ All modified and coverable lines are covered by tests.

@@            Coverage Diff             @@
##             main    #1226       +/-   ##
===========================================
+ Coverage    3.59%   18.56%   +14.97%
===========================================
  Files          13       33       +20
  Lines         947     3409     +2462
  Branches      152      421      +269
===========================================
+ Hits           34      633      +599
- Misses        843     2745     +1902
+ Partials       70       31       -39
I think it is a good idea to split up the robot_driver test into individual tests.
However, I think it would be beneficial to be able to specify mock hardware outside of the test parametrization. That way, we could run the mock hardware tests even when the integration-tests flag is not set, which would effectively allow running some tests on the buildfarm as well.
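As a rough illustration of that idea (not code from this PR): the mock-hardware switch could come from an environment variable read in the test description instead of the launch_testing parametrization, so the mock variant can run even without the integration-tests flag. The environment-variable name and the launch-argument values below are assumptions for the sketch.

```python
import os

import pytest
from launch import LaunchDescription
from launch.actions import IncludeLaunchDescription
from launch.launch_description_sources import PythonLaunchDescriptionSource
from launch.substitutions import PathJoinSubstitution
from launch_ros.substitutions import FindPackageShare
from launch_testing.actions import ReadyToTest

# Switch read from the environment instead of the test parametrization
# (variable name is illustrative).
USE_MOCK_HARDWARE = os.environ.get("UR_TEST_USE_MOCK_HARDWARE", "false")


@pytest.mark.launch_test
def generate_test_description():
    driver = IncludeLaunchDescription(
        PythonLaunchDescriptionSource(
            PathJoinSubstitution(
                [FindPackageShare("ur_robot_driver"), "launch", "ur_control.launch.py"]
            )
        ),
        launch_arguments={
            "ur_type": "ur5e",
            "robot_ip": "192.168.56.101",
            # Assumed argument names; the real launch file may differ.
            "use_mock_hardware": USE_MOCK_HARDWARE,
            "launch_dashboard_client": "false" if USE_MOCK_HARDWARE == "true" else "true",
        }.items(),
    )
    return LaunchDescription([driver, ReadyToTest()])
```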
This seems fine so far, and once all the tests are renamed I would be fine with merging this. There is more potential for refactoring things like the initialization, but for this PR I think it's fine to leave it as it is.
Shouldn't this test also be able to run on mock hardware? I see that it doesn't, but I don't see a conceptual reason why it couldn't.
There is no real reason that it shouldn't, but with the way the controller sets IOs, it apparently cannot verify whether a pin was set, and the test fails because the pins indeed end up at the wrong value. The controller also prints a small warning that this can happen with mock hardware, so I don't think it should be tested there, as it doesn't behave correctly.
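A minimal sketch of how such a test could simply be skipped on mock hardware; the module-level flag and its wiring are assumptions, not the PR's actual mechanism:

```python
import unittest

# Would be derived from the test parametrization or an environment variable.
USE_MOCK_HARDWARE = False


class IOTest(unittest.TestCase):
    @unittest.skipIf(
        USE_MOCK_HARDWARE, "mock hardware does not mirror commanded IO pins"
    )
    def test_set_io(self):
        # On real hardware / URSim: call the set_io service and verify the
        # reported pin state afterwards; omitted here.
        pass
```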
Resolved review threads (outdated) on:
ur_robot_driver/test/integration_test_passthrough_controller.py
ur_robot_driver/test/integration_test_scaled_joint_controller.py
…)" This reverts commit 341506a.
)" This reverts commit 9959221.
Added some launch arguments and some checks to avoid connecting to the dashboard interface when using mock hardware, as that will not work. Maybe not the most elegant solution, but it works for now. Currently the passthrough controller does not work at all with mock hardware and is bypassed in the test when mock hardware is used. test_trajectory_scaled_aborts_on_violation fails, as the mock hardware doesn't abort. test_set_io also fails, as the controller can't verify that a pin has been set.
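A sketch of the kind of guard this commit message describes, assuming a use_mock_hardware launch configuration; the parameter wiring is illustrative rather than the PR's exact code. The dashboard client is simply not started when mock hardware is selected, since there is no dashboard server to connect to.

```python
from launch.conditions import UnlessCondition
from launch.substitutions import LaunchConfiguration
from launch_ros.actions import Node


def dashboard_client_node():
    """Return the dashboard client node, disabled when mock hardware is used."""
    use_mock_hardware = LaunchConfiguration("use_mock_hardware")
    return Node(
        package="ur_robot_driver",
        executable="dashboard_client",
        name="dashboard_client",
        # Do not even start the node on mock hardware; connecting would only
        # time out, since no dashboard server exists.
        condition=UnlessCondition(use_mock_hardware),
        parameters=[{"robot_ip": LaunchConfiguration("robot_ip")}],
    )
```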
And into separate test cases. Moved timeout for trajectory execution to a common file.
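Just to illustrate the shared-timeout idea (the file name and constant name below are assumptions):

```python
# ur_robot_driver/test/test_common.py (hypothetical file name)
TIMEOUT_EXECUTE_TRAJECTORY = 30  # seconds to wait for trajectory execution results

# An individual test would then import the shared value instead of redefining it:
#   from test_common import TIMEOUT_EXECUTE_TRAJECTORY
```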
…t can be used by mock hardware test as well.
Co-authored-by: Felix Exner <[email protected]>