Kermit

11 comments by Kermit

Switching to cookie login also fails with an error:

```
(d2l) D:\file\file\Win10Share\md\python\code\MediaCrawler>python main.py
2024-04-16 18:05:24 MediaCrawler INFO [DouYinLogin.login_by_cookies] Begin login douyin by cookie ...
2024-04-16 18:05:30 MediaCrawler INFO [DouYinLogin.begin] login finished then check login state ...
2024-04-16 18:05:30...
```

Solved, thanks. This is worth adding to the FAQ; I'll open a PR for you.

Hi, does this error mean the account has been banned by Douyin? If so, is there a parameter I can configure to sleep between requests and reduce the access frequency?

```
(tiktokdownload) D:\file\file\Win10Share\md\python\code\MediaCrawler>python main.py
2024-04-17 13:44:36 MediaCrawler INFO [DouYinLogin.login_by_cookies] Begin login douyin by cookie ...
2024-04-17 13:44:43 MediaCrawler INFO [DouYinLogin.begin] login finished then check login state ...
2024-04-17 13:44:43...
```
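If the problem really is request frequency, the generic fix is to sleep between requests. Below is a minimal plain-Python sketch of that idea; `fetch_with_backoff`, `base_delay`, and `jitter` are hypothetical names for illustration, not actual MediaCrawler configuration options.

```python
import random
import time

def fetch_with_backoff(urls, fetch, base_delay=2.0, jitter=1.0):
    """Call `fetch` on each URL, pausing a randomized interval between
    requests to reduce access frequency.

    `fetch` is caller-supplied; `base_delay` and `jitter` (seconds) are
    hypothetical knobs, not real MediaCrawler settings.
    """
    results = []
    for i, url in enumerate(urls):
        if i > 0:
            # Randomized pause so requests are not evenly spaced.
            time.sleep(base_delay + random.uniform(0, jitter))
        results.append(fetch(url))
    return results
```

The random jitter matters: perfectly regular intervals are themselves a bot signature, so a uniform offset on top of the base delay is usually preferable to a fixed sleep.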

Does this not support bilibili video downloads? After installing the f2 framework, `f2 bili -h` reports that the module is not installed; after downloading the source and running `pip install -r requirement.txt`, it still reports the module is not installed. Is bilibili crawling unsupported? @Johnserf-Seed

```
(tiktokdownload) D:\file\file\Win10Share\md\python\code\MediaCrawler\tiktokdownload\TikTokDownload-main>f2 bili -h
ERROR Error: No module named 'f2.apps.bilibili'
Usage: f2 [OPTIONS] COMMAND [ARGS]...
Error: No such command 'bili'.
```

Meanwhile, `f2...

> > Does this not support bilibili video downloads? After installing the f2 framework, `f2 bili -h` reports that the module is not installed; after downloading the source and running `pip install -r requirement.txt`, it still reports the module is not installed. Is bilibili crawling unsupported? @Johnserf-Seed
> >
> > ```
> > (tiktokdownload) D:\file\file\Win10Share\md\python\code\MediaCrawler\tiktokdownload\TikTokDownload-main>f2 bili -h
> > ERROR Error: No module named 'f2.apps.bilibili'
> > Usage: f2...

The author could simply switch the project to a different license, charge for commercial use down the line, and assign the copyright to the contributors.

Thank you for your reply. I tried CoGroupByKey on the two PCollections before; if they are equal, it outputs the PCollection flag, but the problem is that I can't get the...
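For reference, the comparison logic described above can be sketched in plain Python. This mimics the semantics of Beam's CoGroupByKey on two keyed collections and then flags, per key, whether both sides hold the same values; it is an illustration of the grouping, not actual Apache Beam pipeline code, and `co_group_by_key`/`equal_by_key` are names invented here.

```python
from collections import defaultdict

def co_group_by_key(left, right):
    """Group two lists of (key, value) pairs by key, mimicking
    Beam's CoGroupByKey: returns {key: (left_values, right_values)}."""
    grouped = defaultdict(lambda: ([], []))
    for k, v in left:
        grouped[k][0].append(v)
    for k, v in right:
        grouped[k][1].append(v)
    return dict(grouped)

def equal_by_key(left, right):
    """Flag, per key, whether both collections hold the same
    multiset of values for that key."""
    return {
        k: sorted(ls) == sorted(rs)
        for k, (ls, rs) in co_group_by_key(left, right).items()
    }
```

In real Beam code the same shape is expressed as `{'left': pc1, 'right': pc2} | beam.CoGroupByKey()` followed by a `Map` that compares the two grouped value lists.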

I think you are right, **because Spark provides MLlib while K-means exists as a base algorithm example, I thought it would be easy to implement with Apache Beam**. But in...

> Try this : [#14 (comment)](https://github.com/tloen/alpaca-lora/issues/14#issuecomment-1471263165)

Thank you very much for your help. I modified the code as follows and got the program running.

```python
model = PeftModel.from_pretrained(
    model,
    "tloen/alpaca-lora-7b",
    device_map={'': 0})
```
...

> Try this : [#14 (comment)](https://github.com/tloen/alpaca-lora/issues/14#issuecomment-1471263165)

I found a more appropriate way to deploy the model.

```python
tokenizer = LlamaTokenizer.from_pretrained("decapoda-research/llama-7b-hf", device_map={'': 0})
model = LlamaForCausalLM.from_pretrained(
    "decapoda-research/llama-7b-hf",
    load_in_8bit=True,
    torch_dtype=torch.float16,
    device_map={'': 0}
)
# ...
```