
Conversation

Contributor

@MGAMZ MGAMZ commented Oct 26, 2025

Motivation

This PR is a sub-PR of #1665.

According to the NumPy 2.0.0 release notes (https://numpy.org/devdocs/release/2.0.0-notes.html#deprecations), np.compat has been deprecated, as Python 2 is no longer supported.

This causes several pytest failures in tests/test_config/test_lazy.py. This PR fixes the issue.

Modification

test_lazy.py uses numpy.compat as an example numpy module to test whether the lazy import mechanism works properly. Any other existing numpy submodule can serve the same purpose, so this PR simply replaces it with numpy.fft.

@MGAMZ MGAMZ closed this Oct 26, 2025
@MGAMZ MGAMZ force-pushed the dev/Contri_251026 branch from 0a07aad to aede826 Compare October 26, 2025 02:16
@MGAMZ MGAMZ changed the title Make #1665 suitable for merge. [Fix] Deprecated numpy.compact fails the pytest Oct 26, 2025
@MGAMZ MGAMZ changed the title [Fix] Deprecated numpy.compact fails the pytest [Fix] Deprecated numpy.compact fails pytest Oct 26, 2025
@MGAMZ MGAMZ reopened this Oct 26, 2025
@MGAMZ MGAMZ changed the title [Fix] Deprecated numpy.compact fails pytest [Fix] Removed numpy.compact fails pytest Oct 26, 2025
@MGAMZ
Contributor Author

MGAMZ commented Oct 26, 2025

When running pytest locally with a single GPU, the following errors currently occur.

======================================================== short test summary info =========================================================
FAILED tests/test_config/test_config.py::TestConfig::test_dump - RuntimeError
FAILED tests/test_config/test_config.py::TestConfig::test_deepcopy - RuntimeError
FAILED tests/test_config/test_config.py::TestConfig::test_copy - RuntimeError
FAILED tests/test_config/test_config.py::TestConfig::test_lazy_import - RuntimeError
FAILED tests/test_config/test_lazy.py::TestImportTransformer::test_lazy_module - ModuleNotFoundError: Failed to import numpy in None, line 5 for No module named 'numpy.compat'
FAILED tests/test_fileio/test_fileclient.py::TestFileClient::test_http_backend[http-None] - urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
FAILED tests/test_fileio/test_fileclient.py::TestFileClient::test_http_backend[None-http] - urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
FAILED tests/test_hooks/test_sync_buffers_hook.py::TestSyncBuffersHook::test_sync_buffers_hook - RuntimeError: Process 1 exited with error code 10 and exception:
FAILED tests/test_hooks/test_sync_buffers_hook.py::TestSyncBuffersHook::test_with_runner - RuntimeError: Process 1 exited with error code 10 and exception:
FAILED tests/test_logging/test_message_hub.py::TestMessageHub::test_get_scalars - Failed: DID NOT RAISE <class 'AssertionError'>
FAILED tests/test_model/test_model_utils.py::test_convert_syncbn - ValueError: Default process group has not been initialized, please make sure to call init_process_group.
FAILED tests/test_optim/test_optimizer/test_optimizer.py::TestBuilder::test_build_optimizer - AttributeError: 'str' object has no attribute 'update'
FAILED tests/test_optim/test_optimizer/test_optimizer_wrapper.py::TestAmpOptimWrapper::test_init - AssertionError: <torch.amp.grad_scaler.GradScaler object at 0x7f96d4eb6570> is not an instance of <class 'torch.cuda.amp.grad_scaler....
FAILED tests/test_runner/test_runner.py::TestRunner::test_build_optim_wrapper - AttributeError: 'str' object has no attribute 'update'
FAILED tests/test_runner/test_runner.py::TestRunner::test_checkpoint - AttributeError: 'str' object has no attribute 'update'
FAILED tests/test_runner/test_runner.py::TestRunner::test_train - AttributeError: 'str' object has no attribute 'update'
FAILED tests/test_strategies/test_fsdp.py::TestStrategy::test_run_strategy - torch.multiprocessing.spawn.ProcessRaisedException: 
FAILED tests/test_utils/test_package_utils.py::test_get_install_path - importlib.metadata.PackageNotFoundError: No package metadata was found for unknown
FAILED tests/test_utils/test_timer.py::test_timer_run - assert 0.03835725784301758 < 0.03
FAILED tests/test_utils/test_timer.py::test_timer_context - AssertionError: assert 'time: 1.1s\n' == 'time: 1.0s\n'
FAILED tests/test_visualizer/test_vis_backend.py::TestAimVisBackend::test_experiment - ImportError: Please run "pip install aim" to install aim
FAILED tests/test_visualizer/test_vis_backend.py::TestAimVisBackend::test_add_config - ImportError: Please run "pip install aim" to install aim
FAILED tests/test_visualizer/test_vis_backend.py::TestAimVisBackend::test_add_image - ImportError: Please run "pip install aim" to install aim
FAILED tests/test_visualizer/test_vis_backend.py::TestAimVisBackend::test_add_scalar - ImportError: Please run "pip install aim" to install aim
FAILED tests/test_visualizer/test_vis_backend.py::TestAimVisBackend::test_add_scalars - ImportError: Please run "pip install aim" to install aim
FAILED tests/test_visualizer/test_vis_backend.py::TestAimVisBackend::test_close - ImportError: Please run "pip install aim" to install aim
================================= 26 failed, 907 passed, 89 skipped, 2994 warnings in 628.08s (0:10:28) ==================================

I'm trying to fix these errors.

@MGAMZ
Contributor Author

MGAMZ commented Oct 26, 2025

Now test_lazy is fixed.

python3 -m pytest tests/test_config/test_lazy.py -v
=================================================== test session starts ===================================================
platform linux -- Python 3.13.9, pytest-8.4.1, pluggy-1.6.0 -- /home/mgam/miniforge3/envs/pt/bin/python3
cachedir: .pytest_cache
rootdir: /home/mgam/mgam_repos/mmengine
configfile: pytest.ini
plugins: anyio-4.9.0, hydra-core-1.3.2
collected 4 items                                                                                                         

tests/test_config/test_lazy.py::TestImportTransformer::test_lazy_module PASSED                                      [ 25%]
tests/test_config/test_lazy.py::TestLazyObject::test_build PASSED                                                   [ 50%]
tests/test_config/test_lazy.py::TestLazyObject::test_init PASSED                                                    [ 75%]
tests/test_config/test_lazy.py::TestLazyAttr::test_build PASSED                                                     [100%]

==================================================== 4 passed in 0.15s ====================================================

@MGAMZ MGAMZ marked this pull request as ready for review October 26, 2025 03:43
Copilot AI review requested due to automatic review settings October 26, 2025 03:43
Contributor

Copilot AI left a comment

Pull Request Overview

This PR fixes pytest failures caused by the deprecated numpy.compat module by replacing it with numpy.fft in test files. NumPy 2.0 removed the compat module as Python 2 is no longer supported.

Key Changes:

  • Replaced numpy.compat imports with numpy.fft in test files
  • Updated test assertions and variable names to reference fft instead of compat
  • Updated code comments to reflect the module name change

Reviewed Changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated no comments.

File Description
tests/test_config/test_lazy.py Replaced all references to numpy.compat with numpy.fft in imports, assertions, and variable names
tests/data/config/lazy_module_config/test_ast_transform.py Updated import statement from numpy.compat to numpy.fft

@MGAMZ
Contributor Author

MGAMZ commented Oct 26, 2025

@HAOCHENYE This one is ready to be reviewed.

@MGAMZ MGAMZ changed the title [Fix] Removed numpy.compact fails pytest [Test] Removed numpy.compact in numpy>=2.0.0 Oct 26, 2025
@HAOCHENYE HAOCHENYE merged commit 7e0aff5 into open-mmlab:main Oct 26, 2025
6 of 15 checks passed