Fix: Add Transformers 4.47+ compatibility - Tested with macOS MPS

#63

Fix: Add Transformers 4.47+ compatibility

  • Wraps the LlamaFlashAttention2 import in a try-except block for backward compatibility
  • Falls back to LlamaAttention when FlashAttention2 is unavailable
  • Tested on Transformers 4.46.3 with macOS MPS
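The import fallback the bullets describe can be sketched as below. The stub module stands in for `transformers.models.llama.modeling_llama` on Transformers 4.47+, where `LlamaFlashAttention2` was removed, so the snippet runs without Transformers installed; the actual patched file is not shown in this summary.

```python
import sys
import types

# Stub simulating transformers.models.llama.modeling_llama on 4.47+,
# where LlamaFlashAttention2 no longer exists (hypothetical module name).
stub = types.ModuleType("modeling_llama_stub")

class LlamaAttention:
    """Stand-in for the base attention class."""

stub.LlamaAttention = LlamaAttention
sys.modules["modeling_llama_stub"] = stub

# The compatibility pattern from the PR: try the FlashAttention-2 class,
# fall back to the base attention class if the import fails.
try:
    from modeling_llama_stub import LlamaFlashAttention2
except ImportError:
    from modeling_llama_stub import LlamaAttention as LlamaFlashAttention2
```

With a real pre-4.47 Transformers install the `try` branch would succeed instead, so downstream code can use the `LlamaFlashAttention2` name either way.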
adeebaldkheel changed the pull request title to Fix: Add Transformers 4.47+ compatibility - Tested with macOS MPS
adeebaldkheel changed pull request status to closed