Mh, i also see this in iotop. i did ignore it though until today, and luckily someone already opened an issue, thanks :3
@mahescho Ah thanks, the IP thing saved my life, i didn't really notice that the script dies and only sends part of the output to cmk. In case...
Just as a heads-up, i had the same problem, but when executing it with sudo the device list outputs my devices, so it could be a permission problem. Edit: after fiddling...
Oh, i forgot the memory output:
```
free -h
               total        used        free      shared  buff/cache   available
Mem:           3.7Gi       3.0Gi       209Mi       5.0Mi       739Mi       770Mi
Swap:             0B          0B          0B
```
Could also be...
Thanks, and it's nothing urgent, i'm quite relieved that i'm not the only one :D
@CYBERNEURONES wait, so it's an llm scanner?! mhh i let caddy write an access.log now.. the first few requests are valid fedi requests.. well i'll keep an eye on it....
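In case it helps someone, a minimal sketch of the logging setup, assuming a generic site block (example.social, the log path and the upstream are placeholders, not the actual config):
```
example.social {
	# write all requests to a JSON access log so crawler traffic can be inspected later
	log {
		output file /var/log/caddy/access.log
		format json
	}

	# placeholder upstream, whatever the fedi server listens on locally
	reverse_proxy localhost:3000
}
```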
i tried setting the nobot tag in caddy now, and i also use this caddy approach to block known bots: https://darthvi.com/post/forbidden-for-robots/
@erikvanoosten i also added a 403 for bots with specific user agents, because many of them ignore robots.txt.. this helped me a lot (the snippet got cut off, see the complete sketch below):
```
@botForbidden header_regexp User-Agent "(?i)AdsBot-Google|Amazonbot|anthropic-ai|Applebot|Applebot-Extended|AwarioRssBot|AwarioSmartBot|Bytespider|CCBot|ChatGPT|ChatGPT-User|Claude-Web|ClaudeBot|cohere-ai|DataForSeoBot|Diffbot|FacebookBot|Google-Extended|GPTBot|ImagesiftBot|magpie-crawler|omgili|Omgilibot|peer39_crawler|PerplexityBot|YouBoto|semrush|babbar"
handle @botForbidden {...
```
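Since the handle block above is truncated, a complete version could look roughly like this. The site address is a placeholder and the `respond 403` body is an assumption based on the 403 mentioned above, not necessarily the exact original:
```
example.social {
	# match known AI/SEO crawler user agents, case-insensitive
	@botForbidden header_regexp User-Agent "(?i)AdsBot-Google|Amazonbot|anthropic-ai|Applebot|Applebot-Extended|AwarioRssBot|AwarioSmartBot|Bytespider|CCBot|ChatGPT|ChatGPT-User|Claude-Web|ClaudeBot|cohere-ai|DataForSeoBot|Diffbot|FacebookBot|Google-Extended|GPTBot|ImagesiftBot|magpie-crawler|omgili|Omgilibot|peer39_crawler|PerplexityBot|YouBoto|semrush|babbar"

	handle @botForbidden {
		# refuse crawlers that ignore robots.txt
		respond 403
	}
}
```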
Hi, since i got this problem again (crawlers circumvented my previous efforts and now disguise themselves as valid browser requests) i wrote a regex for caddy which blocks external...