Mark Vaz
Hi @di, I missed your response, sorry about that:

```
$ sha256sum cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
04465e1179213e14a316f0eedbb50dc416f701f3f135d4ba16ce4b2892ec4286  cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
```
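(For cross-checking, here is a minimal Python sketch, not part of the original exchange, that reproduces the `sha256sum` output above with the standard `hashlib` module; the helper name `sha256_of` is illustrative only.)

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    # Stream the file in 1 MiB chunks so a large wheel need not fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Should match the `sha256sum` output above.
print(sha256_of("cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl"))
```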
@di I don't have a `-a` option on `b2sum`. Here's my output:

```
$ b2sum -l 256 cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
4642743c6a44b63cb592786a80a0aac21e8eeed8004dafc8b4919284a4a30295  cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
```
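(Similarly, `b2sum -l 256` is BLAKE2b with a 256-bit digest, which can be reproduced in Python via `hashlib.blake2b(digest_size=32)`. Again a sketch, not from the original thread; `blake2b_256_of` is an illustrative name.)

```python
import hashlib

def blake2b_256_of(path, chunk_size=1 << 20):
    # digest_size=32 bytes == 256 bits, matching `b2sum -l 256`.
    h = hashlib.blake2b(digest_size=32)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Should match the `b2sum -l 256` output above.
print(blake2b_256_of("cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl"))
```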
I can't attach the .whl directly, so I'm uploading a .zip of the .whl: [cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.zip](https://github.com/user-attachments/files/16353395/cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.zip)
@di Checking back, is there anything else you need here?
I think if the end goal is to publish this, it should be OK for you to try uploading it. @di, please go ahead.
Thanks Dustin! I think we can close the issue. Closing.
We should be able to start with the linux-64 1xGPU config; for the rest, we'll either need to share capacity from other pools or otherwise rack and stack.