
Implement cleanup after every test in extension #27278

Open · 9 tasks

hjetpoluru (Contributor) opened this issue Sep 19, 2024 · 0 comments

What is this about?

Below are @davidmurdoch's thoughts on the testing framework:

The Ganache errors here are likely due to our Worst Practice ™️ testing methodology (documented here: https://github.com/MetaMask/contributor-docs/blob/main/docs/unit-testing.md#avoid-the-use-of-beforeeach). I outline why it's bad here: #25153 (comment), and there is some conversation about it on Slack here: https://consensys.slack.com/archives/C1L7H42BT/p1720042513020929

Some additional conversation about this problem: #25153 (comment)
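The contributor docs linked above recommend replacing shared `beforeEach` state with explicit per-test setup functions. As a rough illustration of that pattern (the `CounterController` class and `buildController` helper here are hypothetical stand-ins, not actual extension code):

```js
const assert = require('assert');

// Hypothetical controller, standing in for a real extension controller.
class CounterController {
  constructor({ initialCount = 0 } = {}) {
    this.count = initialCount;
  }

  increment() {
    this.count += 1;
  }
}

// Explicit setup function: each test builds exactly the state it needs,
// instead of inheriting it from a shared `beforeEach`.
function buildController(options) {
  return new CounterController(options);
}

describe('CounterController', () => {
  it('starts at the configured count', () => {
    const controller = buildController({ initialCount: 5 });
    assert.equal(controller.count, 5);
  });

  it('increments independently of other tests', () => {
    const controller = buildController();
    controller.increment();
    assert.equal(controller.count, 1);
  });
});
```

Because each test constructs its own controller, no state survives between tests and there is little left to clean up afterwards.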

The current testing framework for the MetaMask extension does not include a cleanup process after each test, which has led to flaky tests and inconsistent results. Implementing a cleanup process after every test is a significant framework change aimed at improving the reliability and stability of the test suite.

This change involves resetting the test environment to a clean state after each test execution, ensuring that tests do not interfere with each other and that results are consistent and reproducible. This will make issues easier to identify and isolate, and reduce the occurrence of flaky tests.

The implementation of this cleanup process will require modifications to the existing test framework, including:

  • Adding setup and teardown methods to reset state before and after each test (a minimal sketch follows this list).
  • Ensuring that all dependencies and resources are properly cleaned up.
  • Updating existing tests to comply with the new framework changes.
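For state that genuinely must be mocked or shared, a teardown hook can guarantee the reset. A minimal sketch using Mocha's `afterEach` with Sinon's default sandbox (the `tokenService` object is a made-up stand-in for a real extension service):

```js
const assert = require('assert');
const sinon = require('sinon');

// Made-up service object, used only to illustrate the teardown.
const tokenService = {
  async fetchRates() {
    return { ETH: 0 };
  },
};

describe('tokenService consumers', () => {
  afterEach(() => {
    // Restore every stub and spy created through Sinon's default
    // sandbox, so mocked behavior cannot leak into the next test.
    sinon.restore();
  });

  it('sees the stubbed rates', async () => {
    sinon.stub(tokenService, 'fetchRates').resolves({ ETH: 3000 });
    assert.deepStrictEqual(await tokenService.fetchRates(), { ETH: 3000 });
  });

  it('sees the real implementation again', async () => {
    assert.deepStrictEqual(await tokenService.fetchRates(), { ETH: 0 });
  });
});
```

The second test only passes because the `afterEach` hook ran; without it, the stub from the first test would still be installed, which is exactly the cross-test interference described above.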

Scenario

No response

Design

Here's how I did it The Right Way ™️ recently: https://github.com/MetaMask/metamask-extension/blob/develop/test/e2e/tests/phishing-controller/phishing-detection.spec.js#L371-L430. It's gross, though.
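The linked spec tears its mock infrastructure down by hand. As a hedged sketch of the general shape (not the actual spec code, and assuming Node 18+ for the global `fetch`), a helper can own a resource's lifecycle so teardown runs even when the test body throws:

```js
const http = require('http');

// Helper that owns the server's lifecycle: the `finally` block runs the
// teardown whether the test body passes or throws, so no port or handle
// leaks into later tests. All names here are illustrative.
async function withTestServer(handler, testFn) {
  const server = http.createServer(handler);
  await new Promise((resolve) => server.listen(0, resolve));
  try {
    const { port } = server.address();
    await testFn(`http://127.0.0.1:${port}`);
  } finally {
    await new Promise((resolve) => server.close(resolve));
  }
}

describe('warning page (illustrative)', () => {
  it('serves the expected content', async () => {
    await withTestServer(
      (req, res) => res.end('warning'),
      async (baseUrl) => {
        const body = await (await fetch(baseUrl)).text();
        if (body !== 'warning') {
          throw new Error(`unexpected body: ${body}`);
        }
      },
    );
  });
});
```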

Technical Details

No response

Threat Modeling Framework

No response

Acceptance Criteria

No response

Stakeholder review needed before the work gets merged

  • Engineering (needed in most cases)
  • Design
  • Product
  • QA (automation tests are required to pass before merging PRs but not all changes are covered by automation tests - please review if QA is needed beyond automation tests)
  • Security
  • Legal
  • Marketing
  • Management (please specify)
  • Other (please specify)

References

No response
