Sq-CC

Results: 9 issues of Sq-CC

It would be even better if the tool could identify which CMS a website uses.

enhancement
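As a sketch of what such CMS detection could look like: prefer the `<meta name="generator">` tag, then fall back to body fingerprints. The fingerprint table below is a small hand-made example for illustration, not an exhaustive database, and the function names are mine:

```python
import re
import urllib.request

# Illustrative fingerprint table: substring of the HTML body -> CMS name.
FINGERPRINTS = {
    "wp-content/": "WordPress",
    "Joomla!": "Joomla",
    "Drupal.settings": "Drupal",
}

def identify_cms(html):
    """Guess the CMS from page HTML: the <meta name=generator> tag wins,
    otherwise fall back to known body fingerprints."""
    m = re.search(
        r'<meta[^>]+name=["\']generator["\'][^>]+content=["\']([^"\']+)',
        html, re.I)
    if m:
        return m.group(1)
    for needle, cms in FINGERPRINTS.items():
        if needle in html:
            return cms
    return None

def identify_cms_url(url):
    """Fetch a page and run the fingerprint check on its body."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return identify_cms(resp.read().decode("utf-8", "replace"))
```

A real implementation would want a much larger fingerprint set (headers, favicon hashes, well-known paths), but the lookup structure stays the same.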

There is a package.json file with the following contents:

![image](https://user-images.githubusercontent.com/58176451/133714815-0924be87-d3da-4c25-8b46-63cf00bcadb4.png)

Let's search npm for the package: https://www.npmjs.com/search?q=prepack-fuzzer

![image](https://user-images.githubusercontent.com/58176451/133714942-4b2a719b-5f3b-4222-8366-39aff3f69f96.png)

Now, run the tool to test: `confused -l npm package.json`

![image](https://user-images.githubusercontent.com/58176451/133715011-ffd6dfa6-74a4-4af4-a2a3-5a0fe6a83f5d.png)

He...

The biggest flaw is that data cannot be edited freely through the web page, for example adding a banner or similar information yourself, which would make later lookups easier.

After running hooker and entering the package name, the following error is returned:

    Traceback (most recent call last):
      File "/home/kali/hooker/hooker.py", line 93, in attach
        online_session = rdev.attach(target)
      File "/usr/local/lib/python3.10/dist-packages/frida/core.py", line 26, in wrapper
        return f(*args, **kwargs)
      File "/usr/local/lib/python3.10/dist-packages/frida/core.py", line 165, in attach...
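The truncated traceback ends inside `rdev.attach(target)`, which in frida usually means the target process was not found or frida-server was unreachable. A minimal sketch (function names are mine, not hooker's) that surfaces the common causes instead of a bare traceback:

```python
def diagnose_attach_failure(exc):
    """Translate a frida attach failure into an actionable hint.

    frida's exception class names identify the failure mode, so matching
    on the name keeps this helper importable without frida installed.
    """
    hints = {
        "ProcessNotFoundError": "the target app is not running; launch it before attaching",
        "ServerNotRunningError": "frida-server is not running (or not reachable) on the device",
        "InvalidArgumentError": "no matching USB device; check adb and the frida-server version",
    }
    return hints.get(type(exc).__name__, str(exc))

def attach(package_name):
    """Attach to a running app over USB, raising a concrete hint on failure."""
    import frida  # pip install frida; frida-server must be running on the device
    device = frida.get_usb_device(timeout=5)
    try:
        return device.attach(package_name)
    except Exception as exc:
        raise SystemExit(f"attach failed: {diagnose_attach_failure(exc)}")
```

Checking that the frida-server version on the device matches the installed Python `frida` package is worth doing first; a mismatch produces similarly opaque attach errors.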

The existing site-crawler feature could be improved by referring to https://github.com/Ciyfly/Argo, or that tool could simply be integrated directly. Frankly, the current crawler is hard to use: it finds very little and misses most content.

Suggestion: allow one-click batch deletion of all statistics entries whose count is 0, instead of having to select-all and delete on each page, then repeat on the next page.

Currently, when asset monitoring discovers new assets, only the assets themselves are pushed to DingTalk and similar channels; the vulnerabilities found on those new assets at the same time (e.g. sensitive files, nuclei results) are not sent. When many assets are being monitored, reviewing them one by one wastes a lot of time, and the best submission window can easily be missed.
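Bundling the findings into the new-asset notification is mostly a matter of rendering one combined message. A minimal sketch, assuming a standard DingTalk robot webhook and a hypothetical findings list (the `source`/`detail` keys are my naming, not the tool's schema):

```python
import json
import urllib.request

def build_asset_message(asset, findings):
    """Render one notification that bundles a new asset with its findings
    (sensitive files, nuclei results) instead of the asset alone."""
    lines = [f"New asset: {asset}"]
    for f in findings:
        lines.append(f"- [{f.get('source', '?')}] {f.get('detail', '')}")
    return "\n".join(lines)

def push_to_dingtalk(webhook_url, text):
    """POST a plain-text message to a DingTalk robot webhook."""
    payload = json.dumps({"msgtype": "text", "text": {"content": text}}).encode()
    req = urllib.request.Request(
        webhook_url, data=payload,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)
```

If the robot is configured with keyword filtering, the fixed "New asset" prefix doubles as the required keyword; a signed webhook would additionally need the timestamp/sign query parameters.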

Some WAFs limit the length of the Cookie header.