
When running a test with more than 11 workers in Locust, CSV files are not generated. #1880

Open
sasa-D-soni opened this issue Dec 16, 2024 · 1 comment


Taurus version: v1.16.29
Python: CPython 3.11.2
OS: 5.10.228-219.884.amzn2.x86_64

When running Locust in distributed mode, the perf_result_csv_origin.csv and perf_result_xml.xml files (configured via reporting: dump-csv and dump-xml) were not generated after only the number of workers was changed; the scenario and all other parameters stayed the same.

  • With up to 10 workers: The CSV and XML files were generated, and the execution completed successfully.
  • With 11 or more workers: The CSV and XML files were not generated, and the execution failed.
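For reference, the reporting section in question looks roughly like this. Only the dump-csv/dump-xml filenames come from the report above; the rest is a minimal sketch of a typical Taurus final-stats configuration:

```yaml
# Minimal sketch (assumed layout; filenames taken from the report above).
reporting:
- module: final-stats
  dump-csv: perf_result_csv_origin.csv   # aggregated per-label stats as CSV
  dump-xml: perf_result_xml.xml          # the same summary as XML
```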

Looking at the bzt logs, the following error was recorded during the failed executions:

bzt.ToolError: Empty results, most likely locust-scenario (LocustIOExecutor) failed. Actual reason for this can be found in logs under /var/taurus/logs

The failure seems to occur in the post_process phase. However, after checking every file under the /var/taurus/logs directory, I could not find anything that points to the actual cause.
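Based on the traceback below, the error is raised unconditionally once the executor's reader has produced no samples. A simplified sketch of that check (illustrative only, not the actual bzt source; the `samples_read` field is a stand-in for whatever the real reader tracks):

```python
# Illustrative sketch of the post_process check that produces the
# "Empty results" error. NOT the real bzt code: the executor dicts
# and the "samples_read" key are hypothetical stand-ins.

class ToolError(Exception):
    def __init__(self, message, diagnostics=None):
        super().__init__(message)
        self.diagnostics = diagnostics


class Provisioning:
    def __init__(self, executors):
        self.executors = executors

    def post_process(self):
        for executor in self.executors:
            if not executor["samples_read"]:
                # With zero samples, there is nothing to dump to CSV/XML,
                # so the run is treated as failed.
                message = ("Empty results, most likely %s failed. "
                           "Actual reason for this can be found in logs"
                           % executor["label"])
                raise ToolError(message)
```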

If you have any ideas or suggestions, I would appreciate your input.

Part of the log from an execution with 10 workers:

[2024-09-05 09:16:50,555 DEBUG Engine.local] Shutdown locust/locust-scenario
[2024-09-05 09:16:51,556 DEBUG Engine.Configuration] Dumping YAML config into /var/taurus/logs/effective.yml
[2024-09-05 09:16:51,564 DEBUG Engine.Configuration] Dumping JSON config into /var/taurus/logs/effective.json
[2024-09-05 09:16:51,564 INFO Engine] Post-processing...
[2024-09-05 09:16:51,564 DEBUG Engine.local] Post-process locust/locust-scenario
[2024-09-05 09:16:51,564 DEBUG Engine.consolidator] Consolidator buffer[2]: dict_keys([1725527805, 1725527806])
[2024-09-05 09:16:51,564 DEBUG Engine.consolidator] Merging into 1725527805
[2024-09-05 09:16:51,564 DEBUG Engine.consolidator] Bypassing consolidation because of single result
[2024-09-05 09:16:51,566 DEBUG Engine.consolidator] Processed datapoint: 1725527805/ConsolidatingAggregator@139739848636688
[2024-09-05 09:16:51,566 DEBUG Engine.consolidator] Merging into 1725527806
[2024-09-05 09:16:51,566 DEBUG Engine.consolidator] Bypassing consolidation because of single result
[2024-09-05 09:16:51,567 DEBUG Engine.consolidator] Processed datapoint: 1725527806/ConsolidatingAggregator@139739848636688
[2024-09-05 09:16:51,567 INFO Engine.final-stats] Test duration: 0:00:36
[2024-09-05 09:16:51,568 INFO Engine.final-stats] Samples count: 87, 24.14% failures
[2024-09-05 09:16:51,568 INFO Engine.final-stats] Average times: total 0.038, latency 0.000, connect 0.000

Part of the log from an execution with 11 workers:

[2024-12-16 09:21:40,697 DEBUG Engine.local] Shutdown locust/locust-scenario
[2024-12-16 09:21:41,698 DEBUG Engine.Configuration] Dumping YAML config into /var/taurus/logs/effective.yml
[2024-12-16 09:21:41,706 DEBUG Engine.Configuration] Dumping JSON config into /var/taurus/logs/effective.json
[2024-12-16 09:21:41,707 INFO Engine] Post-processing...
[2024-12-16 09:21:41,707 DEBUG Engine.local] Post-process locust/locust-scenario
[2024-12-16 09:21:41,707 DEBUG Engine.local] Exception in post_process of LocustIOExecutor: Empty results, most likely locust-scenario (LocustIOExecutor) failed. Actual reason for this can be found in logs under /var/taurus/logs
Traceback (most recent call last):
  File "/opt/venv/lib/python3.11/site-packages/bzt/modules/provisioning.py", line 165, in post_process
    raise ToolError(message, diagnostics)
bzt.ToolError: Empty results, most likely locust-scenario (LocustIOExecutor) failed. Actual reason for this can be found in logs under /var/taurus/logs

[2024-12-16 09:21:41,708 DEBUG Engine] post_process: Empty results, most likely locust-scenario (LocustIOExecutor) failed. Actual reason for this can be found in logs under /var/taurus/logs
Traceback (most recent call last):
  File "/opt/venv/lib/python3.11/site-packages/bzt/engine/engine.py", line 356, in post_process
    module.post_process()
  File "/opt/venv/lib/python3.11/site-packages/bzt/modules/provisioning.py", line 174, in post_process
    reraise(exc_info, exc_value)
  File "/opt/venv/lib/python3.11/site-packages/bzt/utils.py", line 111, in reraise
    raise exc
  File "/opt/venv/lib/python3.11/site-packages/bzt/modules/provisioning.py", line 165, in post_process
    raise ToolError(message, diagnostics)
bzt.ToolError: Empty results, most likely locust-scenario (LocustIOExecutor) failed. Actual reason for this can be found in logs under /var/taurus/logs

[2024-12-16 09:21:41,708 DEBUG Engine.consolidator] Consolidator buffer[0]: dict_keys([])
[2024-12-16 09:21:41,708 INFO Engine.final-stats] Test duration: 0:00:36
[2024-12-16 09:21:41,708 DEBUG Engine.console] No logger_handler or orig_stream was detected
[2024-12-16 09:21:41,708 INFO Engine.final-stats] Test duration: 0:00:36
sasa-D-soni changed the title from "When running a test with more than 11 machines in Locust, CSV files are not generated." to "When running a test with more than 11 workers in Locust, CSV files are not generated." on Dec 16, 2024
@sasa-D-soni (Author)

Upon further investigation, the failure appears to be caused by the consolidator buffer being empty (buffer[0]) when post-processing runs.

Successful Case:

[2024-09-05 09:16:51,564 DEBUG Engine.consolidator] Consolidator buffer[2]: dict_keys([1725527805, 1725527806])  
[2024-09-05 09:16:51,564 DEBUG Engine.consolidator] Merging into 1725527805  
[2024-09-05 09:16:51,564 DEBUG Engine.consolidator] Bypassing consolidation because of single result  
[2024-09-05 09:16:51,566 DEBUG Engine.consolidator] Processed datapoint: 1725527805/ConsolidatingAggregator@139739848636688  
[2024-09-05 09:16:51,566 DEBUG Engine.consolidator] Merging into 1725527806  
[2024-09-05 09:16:51,566 DEBUG Engine.consolidator] Bypassing consolidation because of single result  
[2024-09-05 09:16:51,567 DEBUG Engine.consolidator] Processed datapoint: 1725527806/ConsolidatingAggregator@139739848636688  

Failure Case:

[2024-09-05 09:54:41,633 DEBUG Engine.consolidator] Consolidator buffer[0]: dict_keys([])  
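The log format suggests the consolidator keeps a dict of results keyed by second-level timestamps and drains it at post-process time. A toy model of that behavior (assumed structure, not bzt's real ConsolidatingAggregator) shows why an empty buffer yields no final datapoints, and hence nothing to dump:

```python
from collections import defaultdict

# Toy model of a per-second result buffer (assumed; not bzt internals).
# Each worker sample lands in the bucket for its timestamp;
# post-processing merges and drains whatever is buffered.
buffer = defaultdict(list)


def add_sample(timestamp, sample):
    buffer[timestamp].append(sample)


def merge(samples):
    # Placeholder consolidation: just count the samples in the bucket.
    return {"count": len(samples)}


def drain():
    # Mirrors the "Consolidator buffer[N]: dict_keys([...])" log line:
    # if no samples were ever received, there is nothing to merge.
    print("Consolidator buffer[%d]: %s" % (len(buffer), buffer.keys()))
    return [merge(samples) for _, samples in sorted(buffer.items())]
```

In the failing run the buffer is empty at this point, so drain() would produce zero datapoints, matching the buffer[0] line above.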
