S3Scanner
Scan for misconfigured S3 buckets across S3-compatible APIs!
https://github.com/sa7mon/S3Scanner/blob/6a6760338bfc7948ce445f032519ec20eaea34a1/S3Scanner/S3Bucket.py#L95 Hi, I am trying to use this scanner with URLs such as `https://s3.amazonaws.com/bucketname` or `https://s3.region.amazonaws.com/bucketname`, but in both cases the code says I have an invalid bucket...
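For context, both of those are path-style URLs: the bucket name lives in the first path segment rather than in the hostname. A minimal sketch of how such input could be normalized before validation; the `bucket_from_url` helper is hypothetical and not part of S3Scanner's actual parser:

```python
from typing import Optional
from urllib.parse import urlparse


def bucket_from_url(url: str) -> Optional[str]:
    """Extract a bucket name from a path-style or virtual-hosted S3 URL (illustrative only)."""
    parsed = urlparse(url)
    host = parsed.netloc.lower()
    # Path-style: host is s3.amazonaws.com or s3.<region>.amazonaws.com,
    # and the bucket name is the first path segment.
    if host == "s3.amazonaws.com" or (host.startswith("s3.") and host.endswith(".amazonaws.com")):
        segments = [s for s in parsed.path.split("/") if s]
        return segments[0] if segments else None
    # Virtual-hosted style: bucket name is the leading host label.
    if host.endswith(".amazonaws.com"):
        return host.split(".")[0]
    return None


print(bucket_from_url("https://s3.amazonaws.com/bucketname"))              # bucketname
print(bucket_from_url("https://s3.eu-west-1.amazonaws.com/bucketname"))    # bucketname
```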
Hey there! I was just browsing around and saw issue 119 related to supporting JSON and figured I'd take a stab at it. I added the --json-file (-j) argument to...
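A minimal sketch of how a `--json-file` / `-j` flag could be wired up and the results serialized; the flag name matches the description above, but the surrounding structure (the results list and `main()`) is assumed, not taken from the PR:

```python
import argparse
import json


def main() -> None:
    parser = argparse.ArgumentParser(prog="s3scanner")
    parser.add_argument("-j", "--json-file", dest="json_file",
                        help="write scan results to this file as JSON")
    args = parser.parse_args()

    # Placeholder result records standing in for the scanner's real output.
    results = [{"bucket": "example-bucket", "exists": True, "public_read": False}]

    if args.json_file:
        with open(args.json_file, "w") as fh:
            json.dump(results, fh, indent=2)


if __name__ == "__main__":
    main()
```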
1. New command `ls` to output just the filenames and creation dates. 2. `--enumerate` gets the size of the bucket during the scan. If you have a big list of...
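A sketch of what an `ls`-style listing and an `--enumerate` size total could look like using boto3's `list_objects_v2` paginator; unauthenticated access via `UNSIGNED` and the bucket name are assumptions, and note that the S3 listing exposes `LastModified` rather than a true creation date:

```python
import boto3
from botocore import UNSIGNED
from botocore.config import Config

# Anonymous client, since scanned buckets are typically not ours.
s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

total_bytes = 0
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="example-bucket"):
    for obj in page.get("Contents", []):
        # "ls": key name plus last-modified timestamp
        print(f"{obj['LastModified'].isoformat()}  {obj['Key']}")
        # "--enumerate": running size total
        total_bytes += obj["Size"]

print(f"Total size: {total_bytes} bytes")
```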
For the bucket I was dumping, I was getting errors about files being skipped because they already existed. I checked, and some known directories were showing up as regular files...
I start with `--threads 200` (or `-t 200`), but every time it says it is dumping with 4 threads? Bug?
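For reference, a sketch of how a `--threads` value would normally flow from the CLI into the download pool; the behaviour reported above suggests the parsed value is not reaching that point. The names `dump_bucket` and `download_key` are illustrative, not S3Scanner's:

```python
import argparse
from concurrent.futures import ThreadPoolExecutor


def download_key(key: str) -> None:
    # Stand-in for the real per-object download work.
    print(f"downloading {key}")


def dump_bucket(keys, threads: int) -> None:
    # The worker count should come from the parsed argument, not a hard-coded default.
    with ThreadPoolExecutor(max_workers=threads) as pool:
        list(pool.map(download_key, keys))


parser = argparse.ArgumentParser()
parser.add_argument("-t", "--threads", type=int, default=4)
args = parser.parse_args()
dump_bucket(["a.txt", "b.txt"], args.threads)
```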
In order to avoid regressions that re-introduce #122, a test bucket should be created with maliciously-named objects. (Refer to the 2021-11-28 email from RyotaK for instructions.)
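A sketch of the kind of guard such a regression test could exercise: refuse to write any object whose key resolves outside the destination directory. The `safe_join` helper is hypothetical, not S3Scanner's code:

```python
import os


def safe_join(dest_dir: str, key: str) -> str:
    """Return a local path for `key` under `dest_dir`, rejecting path-traversal keys."""
    dest_root = os.path.realpath(dest_dir)
    target = os.path.realpath(os.path.join(dest_root, key))
    if not target.startswith(dest_root + os.sep):
        raise ValueError(f"object key escapes destination directory: {key!r}")
    return target


# A maliciously-named object like this should be rejected, not written to disk.
try:
    safe_join("dump", "../../tmp/evil")
except ValueError as exc:
    print(exc)
```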
`Download failed: Key: ****, Size: 0, LastModified: 2017-05-30 11:33:45+00:00 | [Errno 21] Is a directory: '***/item/20.Cf571aa6' -> '***/item/20'` It seems that when these errors occur, the program creates dummy files...
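A sketch of one way to avoid the "[Errno 21] Is a directory" failures described in the two reports above: skip zero-byte keys ending in "/" (directory placeholders) and create parent directories before downloading. The bucket name and `dest_dir` are placeholders, and keys should still go through a traversal guard like the one sketched earlier per #122:

```python
import os

import boto3
from botocore import UNSIGNED
from botocore.config import Config

s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))
dest_dir = "dump"

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="example-bucket"):
    for obj in page.get("Contents", []):
        key, size = obj["Key"], obj["Size"]
        if key.endswith("/") and size == 0:
            continue  # directory placeholder, nothing to download
        local_path = os.path.join(dest_dir, key)
        os.makedirs(os.path.dirname(local_path), exist_ok=True)
        s3.download_file("example-bucket", key, local_path)
```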
Good afternoon. Thank you very much for a great tool. It would be nice to be able to get a report in JSON format for the convenience of scan automation...
Is it possible to use this tool as a pipeline job?
Create a silent option, so that only output for vulnerable buckets is printed.
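A minimal sketch of how such a flag could behave: suppress progress output and print only buckets found to be vulnerable. The result structure and the `vulnerable` field are assumptions for illustration:

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--silent", action="store_true",
                    help="print only vulnerable buckets")
args = parser.parse_args()

# Placeholder results standing in for the scanner's real findings.
results = [
    {"bucket": "closed-bucket", "vulnerable": False},
    {"bucket": "open-bucket", "vulnerable": True},
]

for r in results:
    if r["vulnerable"]:
        print(r["bucket"])
    elif not args.silent:
        print(f"{r['bucket']}: not vulnerable")
```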