open_flamingo

An open-source framework for training large multimodal models.

52 open_flamingo issues, sorted by most recently updated

Added GQA as a new eval dataset

Hello! I'm new to multimodal training. Inspired by this exciting project, I hope to try my own fine-tuning experiments on interleaved data. Currently, I have downloaded the pre-trained model (3B)...
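A minimal sketch of loading the released 3B weights before fine-tuning, based on the README's demo arguments; the Hub repo id `openflamingo/OpenFlamingo-3B-vitl-mpt1b` is an assumption and should be checked against the model card:

```python
import torch
from huggingface_hub import hf_hub_download
from open_flamingo import create_model_and_transforms

# Build the 3B architecture (CLIP ViT-L/14 vision tower + MPT-1B language model).
model, image_processor, tokenizer = create_model_and_transforms(
    clip_vision_encoder_path="ViT-L-14",
    clip_vision_encoder_pretrained="openai",
    lang_encoder_path="anas-awadalla/mpt-1b-redpajama-200b",
    tokenizer_path="anas-awadalla/mpt-1b-redpajama-200b",
    cross_attn_every_n_layers=1,
)

# Download the released OpenFlamingo checkpoint and load it; strict=False because
# the frozen vision/language backbones are not stored in the checkpoint.
checkpoint_path = hf_hub_download("openflamingo/OpenFlamingo-3B-vitl-mpt1b", "checkpoint.pt")
model.load_state_dict(torch.load(checkpoint_path, map_location="cpu"), strict=False)
```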

Hello everyone, I'm a student working on a chatbot, and I want to train Flamingo on a custom PDF dataset. I need help. Thank you.

```python
from open_flamingo import create_model_and_transforms

model, image_processor, tokenizer = create_model_and_transforms(
    clip_vision_encoder_path="ViT-L-14",
    clip_vision_encoder_pretrained="openai",
    lang_encoder_path="anas-awadalla/mpt-1b-redpajama-200b",
    tokenizer_path="anas-awadalla/mpt-1b-redpajama-200b",
    cross_attn_every_n_layers=1,
    cache_dir=""  # Defaults to ~/.cache
)
```

When I run this code, this issue occurs: TypeError: Flamingo.__init__()...

bug

```
100%|███████████████████████████████████████| 933M/933M [01:59

File ~/.cache/huggingface/modules/transformers_modules/anas-awadalla/mpt-7b/b772e556c8e8a17d087db6935e7cd019e5eefb0f/modeling_mpt.py
    18  from .hf_prefixlm_converter import add_bidirectional_mask_if_missing, convert_hf_causal_lm_to_prefix_lm
    19  from .meta_init_context import init_empty_weights
    20  from .param_init_fns import MODEL_INIT_REGISTRY, generic_param_init_fn_

File ~/.cache/huggingface/modules/transformers_modules/anas-awadalla/mpt-7b/b772e556c8e8a17d087db6935e7cd019e5eefb0f/hf_prefixlm_converter.py:15
    13  import torch
    14  from transformers.models.bloom.modeling_bloom import BaseModelOutputWithPastAndCrossAttentions, ...
```

bug

**Is your feature request related to a problem? Please describe.** A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] **Describe the workflow you...

I'm currently working with Open Flamingo, which involves calculating perplexity scores for given sentence-image pairs. I've encountered an issue where the perplexity scores for two captions (one true and one...
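A minimal sketch of one way to score a caption's perplexity for an image, assuming the Flamingo forward pass accepts `vision_x`, `lang_x`, `attention_mask`, and `labels` and returns a language-modeling loss (the forward signature is an assumption; `model`, `image_processor`, and `tokenizer` come from `create_model_and_transforms` as in the README demo):

```python
import torch
from PIL import Image

def caption_perplexity(model, image_processor, tokenizer, image: Image.Image, caption: str) -> float:
    # Shape expected by OpenFlamingo: (batch, num_images, frames, channels, height, width).
    vision_x = image_processor(image).unsqueeze(0).unsqueeze(0).unsqueeze(0)

    # Interleave the image token with the caption, as in the README prompts.
    text = f"<image>{caption}<|endofchunk|>"
    encoded = tokenizer(text, return_tensors="pt")
    labels = encoded["input_ids"].clone()
    # For stricter scoring, positions of the <image> token (and any padding) can be
    # masked in `labels` with -100 so they are excluded from the loss.

    with torch.no_grad():
        out = model(
            vision_x=vision_x,
            lang_x=encoded["input_ids"],
            attention_mask=encoded["attention_mask"],
            labels=labels,
        )
    # out.loss is the mean token-level negative log-likelihood; exponentiate for perplexity.
    return torch.exp(out.loss).item()
```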

Hey @anas-awadalla and co, thank you all for your amazing work! I have a question regarding the image encoder that is used. Initializing OpenFlamingo with the demo code provided in the `README.md`:...
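For context, a minimal sketch of how the same vision tower can be instantiated directly through open_clip with the arguments the demo passes to `create_model_and_transforms`; that OpenFlamingo builds its image encoder from open_clip's `ViT-L-14` with the `openai` pretrained weights is an assumption inferred from those arguments:

```python
import open_clip

# Build the CLIP ViT-L/14 model and preprocessing the demo arguments refer to.
clip_model, _, preprocess = open_clip.create_model_and_transforms(
    "ViT-L-14", pretrained="openai"
)

# Only the visual tower is relevant to the image encoder question; the text tower is unused here.
vision_encoder = clip_model.visual
print(vision_encoder)  # inspect the architecture serving as the image encoder
```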

bug

Hey, first of all, thank you for the nice work! I have a question regarding the parameters being fixed to float32 precision in line 321 of `train/train.py`: ``` python...
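The quoted snippet is cut off, so the exact line is not reproduced here. As general background only, a sketch of the common pattern this question touches on: keeping gradient-bearing parameters in float32 for optimizer stability while frozen backbone weights may run in lower precision. This is an illustration of the pattern, not the repo's actual `train/train.py` code:

```python
import torch

def cast_trainable_params_to_fp32(model: torch.nn.Module) -> None:
    # Parameters that receive gradients are kept in float32 so that small
    # optimizer updates are not lost to fp16/bf16 rounding; frozen parameters
    # are left in whatever precision they already use.
    for param in model.parameters():
        if param.requires_grad:
            param.data = param.data.to(torch.float32)
```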

bug

I'm trying to follow the instructions in https://github.com/mlfoundations/open_flamingo/issues/228 in order to run inference on a GPU. My code looks as follows:

```
model, image_processor, tokenizer = create_model_and_transforms(
    clip_vision_encoder_path="ViT-L-14",
    clip_vision_encoder_pretrained="openai",
    lang_encoder_path="anas-awadalla/mpt-1b-redpajama-200b",
    ...
```
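A minimal sketch of moving the model and its inputs onto a GPU for generation, following the README's `generate` interface (`vision_x`, `lang_x`, `attention_mask`, `max_new_tokens`, `num_beams`); the local image path and device handling are assumptions for illustration, not the specific fix discussed in issue #228:

```python
import torch
from PIL import Image

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)  # model/image_processor/tokenizer from create_model_and_transforms above
model.eval()

# Build inputs as in the README demo, then move them to the same device as the model.
image = Image.open("example.jpg")
vision_x = image_processor(image).unsqueeze(0).unsqueeze(0).unsqueeze(0).to(device)

tokenizer.padding_side = "left"
lang_x = tokenizer(["<image>An image of"], return_tensors="pt")
lang_x = {k: v.to(device) for k, v in lang_x.items()}

with torch.no_grad():
    generated = model.generate(
        vision_x=vision_x,
        lang_x=lang_x["input_ids"],
        attention_mask=lang_x["attention_mask"],
        max_new_tokens=20,
        num_beams=3,
    )

print(tokenizer.decode(generated[0]))
```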

bug