
Added kernels from kernel hub for Bamba model

Open romitjain opened this issue 3 months ago • 23 comments

What does this PR do?

Adds support for mamba_ssm and causal_conv1d kernels from the kernel-hub in bamba models.

Fixes https://github.com/huggingface/transformers/issues/41208

Before submitting

  • [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • [X] Did you read the contributor guideline, Pull Request section?
  • [X] Was this discussed/approved via a Github issue or the forum? Please add a link to it if that's the case.
  • [ ] Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
  • [ ] Did you write any new necessary tests?

Who can review?

@vasqu @MekkCyber @drbh

romitjain avatar Oct 13 '25 08:10 romitjain

Downstream changes have not been made (e.g. modeling_granitemoehybrid.py). Since it was mentioned that I shouldn't edit those files directly, I'm not sure how to handle them.

romitjain avatar Oct 13 '25 09:10 romitjain

Hi @romitjain, I'm working on a PR to make the kernel function mapping easier, so we don't have to add new functions like lazy_load_mamba_ssm to the modeling files. Once it is merged, we can refactor your PR to work with the new API, and then apply the changes to all other models using make fix-copies.
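For context, the lazy-load pattern being discussed defers the kernel import until first use, so that importing transformers does not require the optional kernel package. A minimal sketch (illustrative only, not the actual transformers helper):

```python
import importlib


def lazy_load_kernel(module_name: str):
    """Import the kernel module on demand; return None when it is
    not installed so the caller can fall back to a slow path."""
    try:
        return importlib.import_module(module_name)
    except ImportError:
        return None
```

Each model-specific wrapper (like lazy_load_mamba_ssm) is essentially this with the module name baked in, which is why a shared mapping removes a lot of duplication.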

MekkCyber avatar Oct 16 '25 08:10 MekkCyber

@MekkCyber Sure, no worries. Let me know (or share your PR here) once it's done, and I will update my PR.

romitjain avatar Oct 16 '25 08:10 romitjain

Sure, here is the PR: https://github.com/huggingface/transformers/pull/41577

MekkCyber avatar Oct 16 '25 08:10 MekkCyber

@MekkCyber I believe I can refactor my PR by using your new mapping function now?

romitjain avatar Oct 17 '25 06:10 romitjain

Yes, I did that for the Falcon models here: https://github.com/huggingface/transformers/pull/41664. You can do the same for the Bamba models using the same API; however, the mamba-ssm kernel needs to be fixed before merging the PRs.

MekkCyber avatar Oct 17 '25 08:10 MekkCyber

@MekkCyber @vasqu

The mamba_ssm kernels were breaking for me because they could not import mamba_chunk_scan_combined and mamba_split_conv1d_scan_combined. This worked earlier but broke in the latest revision: the default revision uses the incorrect output class (https://github.com/state-spaces/mamba/pull/807).

For my local testing, I made local changes to the mamba_ssm repo and tested the flow. However, we would need to fix it on kernels hub for this PR to work end to end.
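As a quick sanity check while testing a kernel revision locally, one can verify that the loaded module actually exposes the two entry points mentioned above before wiring it in (a hypothetical helper, names taken from the failing imports):

```python
def kernel_has_required_ops(kernel_module) -> bool:
    """Check that the loaded kernel exposes the ops the Bamba
    modeling code needs before wiring it in."""
    required = (
        "mamba_chunk_scan_combined",
        "mamba_split_conv1d_scan_combined",
    )
    return all(hasattr(kernel_module, name) for name in required)
```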

Other than that, structurally, this is ready for review. Can you please have a look?

romitjain avatar Nov 10 '25 17:11 romitjain

@vasqu In the PR that you referenced, the lazy_load_kernel function is called on every forward step (ref).

Since it is not cached, IMO either the approach in this PR or adding an lru_cache decorator to lazy_load_kernel would be a better solution.
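The second option can be sketched like this, under the assumption that lazy_load_kernel takes the kernel name as a hashable argument (illustrative, not the actual transformers code):

```python
import importlib
from functools import lru_cache


@lru_cache(maxsize=None)
def lazy_load_kernel(kernel_name: str):
    """Resolve the kernel module once; every later forward step
    hits the lru_cache instead of re-running the import machinery."""
    try:
        return importlib.import_module(kernel_name)
    except ImportError:
        return None
```

With the decorator, repeated calls with the same name return the cached result, so the per-forward-step overhead disappears.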

WDYT?

romitjain avatar Nov 12 '25 05:11 romitjain

> Since it is not cached, IMO either the approach in this PR or adding lru_cache decorator to lazy_load_kernel would be a better solve.

SGTM. I will still leave it to @MekkCyber though, as he's the main guy for anything kernels. In essence, it should just be done the same way everywhere to avoid unnecessary tech debt.

vasqu avatar Nov 12 '25 12:11 vasqu

@MekkCyber I have addressed your comments, PTAL

PS: It would require resolution of this issue: https://github.com/huggingface/transformers/pull/41540#discussion_r2518310902

romitjain avatar Nov 12 '25 17:11 romitjain

I did not run make fix-copies since I believed I should not edit the modeling files, but because of that the CI is failing. Let me know what the steps should be.

romitjain avatar Nov 13 '25 04:11 romitjain

In your case, you do need to run make fix-copies to propagate the modifications you made in the modular files to the real modeling files.

SunMarc avatar Nov 13 '25 13:11 SunMarc

@SunMarc Done, can you please review? cc: @MekkCyber

romitjain avatar Nov 13 '25 16:11 romitjain

@ArthurZucker That works for me! If kernels is not installed, it falls back to the default system packages.
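The fallback can be sketched generically like this (module names and ordering are illustrative, e.g. preferring a hub-provided kernel package over a locally installed one):

```python
import importlib


def import_with_fallback(*candidates):
    """Return the first importable module from `candidates`,
    or None when none of them is installed."""
    for name in candidates:
        try:
            return importlib.import_module(name)
        except ImportError:
            continue
    return None
```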

romitjain avatar Nov 14 '25 15:11 romitjain

@MekkCyber Gentle reminder. What would be the next steps?

romitjain avatar Nov 17 '25 05:11 romitjain

Hi @romitjain, nothing to do on your side for now. I need to refactor the kernel function calls to make them less verbose and less visually intrusive.

MekkCyber avatar Nov 17 '25 13:11 MekkCyber

Thanks @MekkCyber 👍🏽

romitjain avatar Nov 17 '25 13:11 romitjain

@MekkCyber Is there an ETA for this merge? I can help out with other changes to lazy_load_kernel if required.

romitjain avatar Nov 24 '25 05:11 romitjain

Thanks @romitjain ! We are currently working on adding a decorator to use function kernels from the hub. Once it's ready we can integrate it. Will let you know

MekkCyber avatar Nov 24 '25 08:11 MekkCyber

@MekkCyber Would really appreciate an update on this. I am happy to help.

I see two functions, use_kernel_forward_from_hub and use_kernel_func_from_hub; do you suggest using them instead?

romitjain avatar Dec 08 '25 06:12 romitjain

@MekkCyber I have resolved your comments. You might need to fix the upstream mamba-ssm package in kernel hub too.

https://github.com/huggingface/transformers/pull/41540#discussion_r2518310902

romitjain avatar Dec 09 '25 04:12 romitjain

Hi @romitjain! It should be good now; you can use the kernels from this version instead of the clean-mamba-ssm one: https://huggingface.co/kernels-community/mamba-ssm/tree/v0.0.4. Let me know if you have any problems when you test it.

MekkCyber avatar Dec 10 '25 16:12 MekkCyber

@MekkCyber Thanks for making the upstream fix, all the imports have been improved now to avoid nested calls. PTAL

romitjain avatar Dec 15 '25 08:12 romitjain

Thanks @romitjain! Can you fix the styling issues with make style?

MekkCyber avatar Dec 15 '25 12:12 MekkCyber

Oops, I forgot to run that, @MekkCyber. Done now.

romitjain avatar Dec 15 '25 13:12 romitjain

Thanks @MekkCyber What would be the next steps for the merge?

romitjain avatar Dec 15 '25 14:12 romitjain

Failing tests seem unrelated! We just need to wait for a green CI to merge.

MekkCyber avatar Dec 15 '25 21:12 MekkCyber

Seems like CI is also failing for the latest commits on main: https://github.com/huggingface/transformers/commits/main/

romitjain avatar Dec 16 '25 04:12 romitjain

[For maintainers] Suggested jobs to run (before merge)

run-slow: bamba, granitemoehybrid, jamba, mamba2, qwen3_next

github-actions[bot] avatar Dec 16 '25 11:12 github-actions[bot]

View the CircleCI Test Summary for this PR:

https://huggingface.co/spaces/transformers-community/circle-ci-viz?pr=41540&sha=65e283

github-actions[bot] avatar Dec 16 '25 11:12 github-actions[bot]