
suggestions #8

Open · Damieo opened this issue Jul 22, 2021 · 1 comment
Labels: enhancement (New feature or request), good first issue (Good for newcomers)

Comments

Damieo commented Jul 22, 2021

- Scan sitemap.xml.
- Add an option to start the URL fuzzing with a set number of async workers, e.g. -r 20 to start with 20 workers. If that doesn't work because of a DDoS-protection engine or HTTP 429 responses, it should automatically drop to 10 async workers. And if a custom value such as 25 is set instead of the default steps of 10/20/30, it should jump down to the next lowest step, i.e. from 25 to 20 rather than from 30.
- Add an option to lower the async workers by a custom interval, e.g. 2 or 5 workers at a time, instead of always in steps of 10 (100 to 90, and so on). A rough sketch of this back-off idea follows the list.
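For illustration, here is a minimal asyncio/aiohttp sketch of the back-off being proposed. The names (fuzz, next_lower_step), the step table, and the step_down parameter are hypothetical, not this project's actual code or CLI:

```python
import asyncio
from typing import Optional

import aiohttp

# Hypothetical default schedule: worker counts snap down along these steps.
DEFAULT_STEPS = [10, 20, 30, 40, 50]


def next_lower_step(current: int, steps=DEFAULT_STEPS) -> int:
    """Largest step strictly below `current`, else the smallest step (25 -> 20)."""
    lower = [s for s in steps if s < current]
    return max(lower) if lower else min(steps)


async def fuzz(urls, workers: int, step_down: Optional[int] = None) -> None:
    """Fuzz `urls` with `workers` tasks; shrink the pool when blocked (HTTP 429)."""
    queue: asyncio.Queue = asyncio.Queue()
    for url in urls:
        queue.put_nowait(url)

    blocked = asyncio.Event()

    async def worker(session: aiohttp.ClientSession) -> None:
        while True:
            try:
                url = queue.get_nowait()
            except asyncio.QueueEmpty:
                return
            try:
                async with session.get(url) as resp:
                    if resp.status == 429:       # rate-limited: trigger back-off
                        blocked.set()
                        queue.put_nowait(url)    # requeue for the next round
                        return
                    print(resp.status, url)
            except aiohttp.ClientError:
                pass

    async with aiohttp.ClientSession() as session:
        while not queue.empty():
            blocked.clear()
            await asyncio.gather(*(worker(session) for _ in range(workers)))
            if blocked.is_set():
                # Custom decrement (e.g. 2 or 5) if given, else snap to the next
                # lower step; a real tool would also give up at some floor.
                workers = max(1, workers - step_down) if step_down else next_lower_step(workers)
                await asyncio.sleep(1)           # brief pause before retrying
```

With step_down set, the decrements stay gentle (100 -> 98 -> 96 ...); without it, the step table reproduces the default 10/20/30 behavior described above.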

avilum added the enhancement (New feature or request) and good first issue (Good for newcomers) labels on Jul 23, 2021

avilum (Owner) commented Jul 23, 2021

Thanks for the proposal!
Scanning the sitemap.xml is pretty easy to do - can you come up with constant values for this configuration?
It currently works such that if it gets blocked, it lowers the number of workers.
Regarding the "-r" flag - do you mean simply overriding the default number of workers?
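Since the thread mentions scanning sitemap.xml, one possible way to pull seed URLs from it is sketched below. urls_from_sitemap is a hypothetical helper, not part of this project; the namespace URI is the standard sitemaps.org one:

```python
import urllib.request
import xml.etree.ElementTree as ET

# Standard sitemap namespace defined by the sitemaps.org protocol.
SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}


def urls_from_sitemap(base_url: str) -> list:
    """Fetch <base_url>/sitemap.xml and return the <loc> URLs it lists."""
    with urllib.request.urlopen(f"{base_url.rstrip('/')}/sitemap.xml") as resp:
        root = ET.fromstring(resp.read())
    return [loc.text for loc in root.findall(".//sm:loc", SITEMAP_NS)]
```

The returned list could then seed the fuzzing queue directly.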
