Request for Python 3.12 wheel: flash-attn for CUDA 13 + PyTorch 2.9
#17
opened by razvanab
Could you please compile flash-attn for Python 3.12 with CUDA 13 and PyTorch 2.9?
Thank you.
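
For reference, a small Python snippet (a sketch, assuming PyTorch is already installed) to print the exact versions a matching wheel would need to target:

```python
# Print the environment details the requested flash-attn wheel must match.
# Assumes PyTorch is installed; flash-attn itself need not be.
import sys
import torch

print("Python: ", sys.version.split()[0])          # e.g. 3.12.x
print("PyTorch:", torch.__version__)               # e.g. 2.9.x
print("CUDA (torch build):", torch.version.cuda)   # e.g. 13.x
```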