
Adding SegGPT

Open · EduardoPach opened this pull request 2 years ago · 4 comments

What does this PR do?

This PR adds SegGPT to the transformers library.

Fixes https://github.com/huggingface/transformers/issues/27514

TO-DOs

  • [x] Finish Implementation
  • [x] Write conversion script
  • [x] Finish ImageProcessor
  • [x] Make all tests green
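
For context, here is a rough sketch of the intended inference flow. The class names, processor arguments, and checkpoint id below are assumptions at this stage and may change before the final API lands:

```python
import torch
from PIL import Image
from transformers import SegGptForImageSegmentation, SegGptImageProcessor

# Checkpoint id is a placeholder until the final org location is settled.
checkpoint = "BAAI/seggpt-vit-large"
processor = SegGptImageProcessor.from_pretrained(checkpoint)
model = SegGptForImageSegmentation.from_pretrained(checkpoint)

# SegGPT does in-context segmentation: it takes a prompt image together with its
# mask and segments the same concept in a new target image. Paths are placeholders.
image = Image.open("target.png").convert("RGB")
prompt_image = Image.open("prompt.png").convert("RGB")
prompt_mask = Image.open("prompt_mask.png").convert("L")

inputs = processor(
    images=image,
    prompt_images=prompt_image,
    prompt_masks=prompt_mask,
    return_tensors="pt",
)

with torch.no_grad():
    outputs = model(**inputs)
# Turning the raw outputs into a mask (post-processing) is discussed further down
# in this thread.
```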

EduardoPach · Nov 27 '23 22:11

@NielsRogge just pinging you here

EduardoPach · Nov 27 '23 23:11

Hey @NielsRogge, could you take a look at the PR? The code quality check is failing, but that's unrelated to this change. As for the documentation tests, I'm not sure what's going on: they're failing on an example that isn't even in modeling_seggpt.py.

EduardoPach · Dec 08 '23 16:12

Looks like we're nearly there. Just one comment: could we make the current post_process_masks method consistent with the existing post_process_semantic_segmentation and post_process_instance_segmentation methods in the library (perhaps also using the same name)? Those methods return a set of binary masks.

I have this notebook to showcase inference: https://colab.research.google.com/drive/1MZ0quroT0E2c5mnJjmjdDo_FrDXTUCQD.

NielsRogge · Jan 14 '24 11:01

Some of the modifications I made might break your inference example. Instead of post_process_masks, you should now use post_process_semantic_segmentation. Here is an example of my own: https://colab.research.google.com/drive/1UfgLOOVHZJhdxIzfIc6Z1njdOkGTFTJT?usp=sharing
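
Concretely, the post-processing call changes roughly like this (exact argument names follow my notebook and may differ slightly):

```python
# Assumes `processor`, `outputs`, and `image` from an inference setup like the
# sketch earlier in this thread.

# Old (SegGPT-specific method initially added in this PR):
# mask = processor.post_process_masks(outputs, ...)

# New (consistent with the library's other segmentation post-processors):
# target_sizes takes the original (height, width) of each target image.
segmentation_map = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[(image.height, image.width)]
)[0]
```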

EduardoPach · Jan 26 '24 22:01

Awesome work! Thanks for all the effort adding and iterating on this.

Just some tiny nits. Please do not mark comments as resolved when they are not: if you don't think a suggestion should be applied, reply explaining why before resolving it.

The main thing left to do is to update all the checkpoints and make sure all the model tests (incl. slow) pass with them. Once that's all done it'll be ready to merge :)

@amyeroberts what do you mean by updating the checkpoints? Do you mean moving them to the appropriate org, or re-uploading them (it's just one) with the conversion script?

EduardoPach · Feb 21 '24 19:02

@EduardoPach The checkpoints should be uploaded under the correct org - @NielsRogge can help you with that. Then all the checkpoint references in this PR need to be updated to point to the right location
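
In practice that means changing the hard-coded repo ids in the code, docstrings, and tests, along these lines (both repo ids below are illustrative, not the actual ones in this PR):

```python
from transformers import SegGptForImageSegmentation

# Before: checkpoint under the contributor's personal namespace (illustrative id).
model = SegGptForImageSegmentation.from_pretrained("EduardoPach/seggpt-vit-large")

# After: the same weights re-hosted under the appropriate org (illustrative id).
model = SegGptForImageSegmentation.from_pretrained("BAAI/seggpt-vit-large")
```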

amyeroberts · Feb 21 '24 21:02

Hey @amyeroberts, the checkpoints were moved and the tests are green.

EduardoPach · Feb 24 '24 11:02

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.