varlen attention tutorial
This PR adds a tutorial for a variable-length attention API.
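As background for reviewers: variable-length attention computes ordinary attention per sequence while ignoring padded positions, so a batch of sequences with different lengths can be processed together. A minimal NumPy sketch of that computation (illustrative only — the function name and padded-batch layout here are my own; the tutorial's actual API packs sequences instead of padding them):

```python
import numpy as np

def varlen_attention(q, k, v, seq_lens):
    """Attention over a padded batch, masking positions beyond each
    sequence's true length. Hypothetical helper for illustration;
    not the API covered by the tutorial."""
    B, T, D = q.shape
    out = np.zeros_like(q)  # padded positions stay zero
    for b in range(B):
        L = seq_lens[b]
        # scores only over the valid prefix of this sequence
        scores = q[b, :L] @ k[b, :L].T / np.sqrt(D)   # (L, L)
        scores -= scores.max(axis=-1, keepdims=True)  # numerically stable softmax
        w = np.exp(scores)
        w /= w.sum(axis=-1, keepdims=True)
        out[b, :L] = w @ v[b, :L]
    return out

rng = np.random.default_rng(0)
B, T, D = 2, 5, 4
q = rng.standard_normal((B, T, D))
k = rng.standard_normal((B, T, D))
v = rng.standard_normal((B, T, D))
out = varlen_attention(q, k, v, seq_lens=[3, 5])
print(out.shape)  # (2, 5, 4); rows past each seq_len remain zero
```

A dedicated varlen kernel gets the same result without materializing padding, which is the efficiency point the tutorial is demonstrating.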
:link: Helpful Links
:test_tube: See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/tutorials/3660
- :page_facing_up: Preview Python docs built from this PR
Note: Links to docs will display an error until the docs builds have been completed.
:white_check_mark: No Failures
As of commit af7fe97f790eb542ff77b6bab62df3bab216d87b with merge base 63e3575c8b077fb8eded4348ccdb41b0e9bc965b:
:green_heart: Looks good so far! There are no failures yet. :green_heart:
This comment was automatically generated by Dr. CI and updates every 15 minutes.
@svekars any tips for reviewing this code in a more human-friendly way?
There is more info here: https://github.com/pytorch/tutorials#contributing, and there is a script that will help you convert your .ipynb to .py.
There is also an error here: https://github.com/pytorch/tutorials/actions/runs/19864783846/job/56924391840?pr=3660#step:9:4582 - let me check how we should test against 2.10.
@AlannaBurke can you please review this 2.10 tutorial?
Nightly preview: https://docs-preview.pytorch.org/pytorch/tutorials-nightly-preview/3660/intermediate/variable_length_attention_tutorial.html
Looks great! Make sure you've resolved all of @svekars's comments.
yep resolved