Scan sitemap.xml.
Add an option to start the URL fuzzing with a set number of async workers, e.g. "-r 20" to start with 20 workers. If that doesn't work because of a DDoS-protection engine or a 429 response, it would automatically drop to 10 async workers. If a custom value such as 25 is set instead of the default steps of 10 (10, 20, 30), it should jump down to the next lowest step, which would be 20, the same drop the default program makes from 30, except the starting point is now the custom limit of 25.
Add an option to decrease the async workers by a custom interval (for example 2 or 5 workers at a time) instead of the fixed steps of 10 (100 to 90, and so on). A rough sketch of the intended back-off behaviour is below.
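To illustrate the request, here is a minimal sketch of the adaptive back-off idea, assuming a Python/asyncio implementation with aiohttp. The function name, flag values, and step arithmetic are hypothetical and are not part of the project's actual code:

```python
import asyncio

import aiohttp


async def fuzz(base_url, paths, start_workers=20, step=10, min_workers=1):
    """Hypothetical sketch: start with `start_workers` concurrent requests and,
    whenever the target answers 429, drop to the next lower multiple of `step`
    before continuing."""
    workers = start_workers
    queue = asyncio.Queue()
    for path in paths:
        queue.put_nowait(path)

    async with aiohttp.ClientSession() as session:
        while not queue.empty():
            # Take at most `workers` paths for this round of requests.
            batch = []
            while len(batch) < workers and not queue.empty():
                batch.append(queue.get_nowait())

            responses = await asyncio.gather(
                *(session.get(f"{base_url}/{p}") for p in batch),
                return_exceptions=True,
            )

            throttled = False
            for resp in responses:
                if isinstance(resp, Exception):
                    continue
                if resp.status == 429:
                    throttled = True
                resp.release()

            if throttled and workers > min_workers:
                # e.g. 25 -> 20 when step is 10; 20 -> 10; never below min_workers.
                workers = max(min_workers, (workers - 1) // step * step)


# Example: asyncio.run(fuzz("https://example.com", ["admin", "login"], start_workers=25))
```

With this arithmetic, a custom start value of 25 drops to 20 on the first 429 (the "next lowest" default step), and passing a smaller `step` such as 2 or 5 gives the finer-grained back-off intervals described above.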
Thanks for the proposal!
Scanning the sitemap.xml is pretty easy to do. Can you come up with constant values for this configuration?
It currently works so that if it gets blocked, it lowers the number of workers.
Regarding the "-r" flag: do you mean simply overriding the default number of workers?