baiyuting

16 issue results for baiyuting

⨯ Get "https://github-production-release-asset-2e65be.s3.amazonaws.com/9384267/873dfe00-29ad-11eb-8076-41b7ccd28dad?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAIWNJYAX4CSVEH53A%2F20210122%2Fus-east-1%2Fs3%2Faws4_request&X-A mz-Date=20210122T084337Z&X-Amz-Expires=300&X-Amz-Signature=a3cf3699e0c6dc54326d815b46b8d30201a90921b6802f0cf596c5a65c99b6d3&X-Amz-SignedHeaders=host&actor_id=0&key_id=0&repo_id=9384267&response-content-disposition=attachment%3B%20filen ame%3Delectron-v8.5.5-win32-x64.zip&response-content-type=application%2Foctet-stream": read tcp 192.168.174.128:59333->52.217.73.36:443: wsarecv: An existing connection was forcibly closed by the remote host. github.com/develar/app-builder/pkg/download.(*Downloader).follow.func1 /Volumes/data/Documents/app-builder/pkg/download/downloader.go:158 github.com/develar/app-builder/pkg/download.(*Downloader).follow /Volumes/data/Documents/app-builder/pkg/download/downloader.go:186 github.com/develar/app-builder/pkg/download.(*Downloader).Download /Volumes/data/Documents/app-builder/pkg/download/downloader.go:80 github.com/develar/app-builder/pkg/electron.(*ElectronDownloader).doDownload /Volumes/data/Documents/app-builder/pkg/electron/electronDownloader.go:192 github.com/develar/app-builder/pkg/electron.(*ElectronDownloader).Download /Volumes/data/Documents/app-builder/pkg/electron/electronDownloader.go:177 github.com/develar/app-builder/pkg/electron.downloadElectron.func1.1...

I found `seqs *= self.item_emb.embedding_dim ** 0.5` in the function `log2feats(self, log_seqs)`. Is there any reason for scaling `seqs` after the embedding lookup?
`seqs = self.item_emb(torch.LongTensor(log_seqs).to(self.dev))`
`seqs *= self.item_emb.embedding_dim ** 0.5`
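
For context, this looks like the `sqrt(d)` embedding scaling from the original Transformer paper (Vaswani et al., Sec. 3.4), which presumably keeps the embedding magnitudes comparable to the positional embeddings added right after. A minimal, self-contained sketch of the pattern (the sizes and names below are illustrative, not the repo's actual code):

```python
import torch
import torch.nn as nn

# Illustrative sizes, not the repo's actual hyperparameters.
num_items, embedding_dim = 1000, 64
item_emb = nn.Embedding(num_items, embedding_dim)

log_seqs = torch.randint(0, num_items, (2, 10))  # (batch, seq_len) of item ids
seqs = item_emb(log_seqs)                        # (batch, seq_len, embedding_dim)
seqs = seqs * embedding_dim ** 0.5               # rescale before adding positional embeddings
```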

When I try to download the data from Hugging Face, it fails; something seems to be wrong.
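
For reference, a minimal sketch of the usual Hub download path (the repo id below is a placeholder, not the dataset from this issue):

```python
from huggingface_hub import snapshot_download

# Hypothetical repo id; substitute the dataset that actually fails.
local_dir = snapshot_download(repo_id="user/dataset-name", repo_type="dataset")
print(local_dir)  # local path containing the downloaded files
```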

Could I just use GPT-2 directly in MAGIC search? Are there any comparison results between a fine-tuned GPT-2 and a fixed (frozen) GPT-2?
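
For clarity, "fixed" here means the language model's weights are frozen. A minimal sketch of loading a frozen GPT-2 with Hugging Face transformers (generic, not MAGIC's actual loading code):

```python
from transformers import GPT2LMHeadModel

model = GPT2LMHeadModel.from_pretrained("gpt2")
for p in model.parameters():
    p.requires_grad = False  # frozen: used only for scoring/decoding, not trained
model.eval()
```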

For example: given a passage of text, list the legal provisions it involves (there may be more than one). Has this kind of scenario been considered?

Did anyone reproduce the transformer mapping network with frozen GPT-2? I ran the command `python train.py --only_prefix --data ./data/coco/oscar_split_ViT-B_32_train.pkl --out_dir ./coco_train/ --mapping_type transformer --num_layers 8 --prefix_length 40 --prefix_length_clip 40` and the model is trained...

When I tried to use the same model to generate text through beam search and argmax (greedy) decoding, I found that beam search gives worse results. I used generate_beam() and generate2() from clip_prefix_captioning_inference.ipynb to...
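
As a generic point of comparison (not the notebook's generate_beam()/generate2()), here is a sketch of greedy vs. beam decoding with the transformers generate() API; length_penalty is the usual knob to check when beam search favors short, generic outputs:

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tok = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

ids = tok("A photo of", return_tensors="pt").input_ids
greedy = model.generate(ids, max_new_tokens=20)  # argmax at each step
beam = model.generate(ids, max_new_tokens=20, num_beams=5,
                      length_penalty=1.0)        # values > 1.0 favor longer sequences
print(tok.decode(greedy[0]))
print(tok.decode(beam[0]))
```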

After creating the geneface env according to install_guide, it may be better to install ffmpeg first; this can avoid errors caused by ffmpeg and make sure that the synthesized video...
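
A minimal sketch of the kind of preflight check that would catch this early (generic Python, not part of the GeneFace codebase):

```python
import shutil
import subprocess

# Fail fast if ffmpeg is missing, instead of erroring mid-synthesis.
if shutil.which("ffmpeg") is None:
    raise RuntimeError("ffmpeg not found on PATH; install it (e.g. from conda-forge) first")
subprocess.run(["ffmpeg", "-version"], check=True)
```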

I find that LaVIN trains for 15 epochs, but for LLaVA the epochs are only 3. I want to reduce the training epochs. Are there any suggestions on how to...

Could you please provide a shell script for MME evaluation?