
Flamingo Implementation

king159 opened this pull request 2 years ago • 2 comments

What does this PR do?

Implementation of Flamingo models (https://arxiv.org/abs/2204.14198). Model weights trained by the OpenFlamingo team can be downloaded here. A weight conversion script is included.

Weight conversion can be run via:

python src/transformers/models/flamingo/converting_flamingo_to_hf.py \
    --old_ckpt_path /path/to/open/flamingo/weights \
    --new_hf_path /output/path
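For reference, conversion scripts of this kind typically just remap state-dict keys from the old module names to the new Hugging Face ones before re-saving. A minimal sketch of that pattern (the rename rules and key names below are hypothetical illustrations, not the PR's actual mapping):

```python
# Hypothetical sketch of the checkpoint-conversion pattern: rewrite old
# state-dict keys to new module names. The prefix rules below are
# illustrative only, not the real OpenFlamingo -> HF mapping.

def convert_state_dict(old_state_dict, rename_rules):
    """Return a new dict with keys rewritten by (old_prefix, new_prefix) rules."""
    new_state_dict = {}
    for key, value in old_state_dict.items():
        new_key = key
        for old_prefix, new_prefix in rename_rules:
            if new_key.startswith(old_prefix):
                new_key = new_prefix + new_key[len(old_prefix):]
                break
        new_state_dict[new_key] = value
    return new_state_dict

# Dummy example: plain floats stand in for weight tensors.
old_sd = {"lang_encoder.layers.0.weight": 1.0, "vision_encoder.proj": 2.0}
rules = [("lang_encoder.", "flamingo.language_model."),
         ("vision_encoder.", "flamingo.vision_model.")]
new_sd = convert_state_dict(old_sd, rules)
```

The real script additionally has to handle loading the original checkpoint and saving in the `save_pretrained` layout, but the key-renaming loop is the core of it.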

Models can then be loaded via:

model = transformers.FlamingoForConditionalGeneration.from_pretrained("/output/path")

Example:

import requests
import torch
import transformers
from PIL import Image

# Load the converted checkpoint from the step above
model = transformers.FlamingoForConditionalGeneration.from_pretrained("/output/path")
tokenizer = model.text_tokenizer
image_processor = transformers.CLIPImageProcessor()
demo_image_one = Image.open(
    requests.get(
        "http://images.cocodataset.org/val2017/000000039769.jpg", stream=True
    ).raw
)
demo_image_two = Image.open(
    requests.get(
        "http://images.cocodataset.org/test-stuff2017/000000028137.jpg", stream=True
    ).raw
)
query_image = Image.open(
    requests.get(
        "http://images.cocodataset.org/test-stuff2017/000000028352.jpg", stream=True
    ).raw
)
vision_x = (
    image_processor.preprocess(
        [demo_image_one, demo_image_two, query_image], return_tensors="pt"
    )["pixel_values"]
    .unsqueeze(1)
    .unsqueeze(0)
)
model.text_tokenizer.padding_side = "left"
lang_x = tokenizer(
    ["<image>An image of two cats.<|endofchunk|><image>An image of a bathroom sink.<|endofchunk|><image>An image of"],
    return_tensors="pt",
)

generated_text = model.generate(
    vision_x=vision_x,
    lang_x=lang_x["input_ids"],
    attention_mask=lang_x["attention_mask"],
    max_new_tokens=20,
    num_beams=3,
)

print("Generated text: ", model.text_tokenizer.decode(generated_text[0]))
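The two `.unsqueeze` calls in the example above are pure shape bookkeeping: the image processor returns `pixel_values` of shape `(num_images, C, H, W)`, and the model wants a six-dimensional vision input roughly of the form `(batch, num_media, num_frames, C, H, W)` (the axis names here are how OpenFlamingo describes `vision_x`; treat them as an assumption about this port). A pure-Python sketch of that transformation:

```python
# Shape bookkeeping for vision_x in the example above.
# CLIPImageProcessor yields pixel_values of shape (num_images, C, H, W);
# unsqueeze(1) inserts a frames axis, unsqueeze(0) a batch axis.

def unsqueeze_shape(shape, dim):
    """Mimic torch.Tensor.unsqueeze on a plain shape tuple."""
    return shape[:dim] + (1,) + shape[dim:]

pixel_values = (3, 3, 224, 224)                    # 3 images, 3x224x224 each
after_frames = unsqueeze_shape(pixel_values, 1)    # (3, 1, 3, 224, 224)
vision_x = unsqueeze_shape(after_frames, 0)        # (1, 3, 1, 3, 224, 224)
```

So in the example, batch size is 1, there are 3 media items (the two demo images plus the query image), and each is a single frame.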

Before submitting

  • [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • [x] Did you read the contributor guideline, Pull Request section?
  • [ ] Was this discussed/approved via a Github issue or the forum? Please add a link to it if that's the case.
  • [x] Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
  • [ ] Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.

king159 avatar Apr 29 '23 12:04 king159

Respect! OpenFlamingo needs to be built on Hugging Face Transformers for more efficient training and inference.

We have already adapted it in our Otter model (an instruction-tuned model based on Flamingo). We uploaded converted openflamingo-9b weights at luodian/openflamingo-9b-hf.

The model can be loaded via:

model = transformers.FlamingoForConditionalGeneration.from_pretrained("luodian/openflamingo-9b-hf")

Luodian avatar Apr 29 '23 12:04 Luodian

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.

cc @amyeroberts and @younesbelkada

sgugger avatar May 01 '23 13:05 sgugger

Awesome work! Let us know when the PR is ready for review!

younesbelkada avatar May 02 '23 09:05 younesbelkada