chore: mark as deprecated in favour of layer-norm

#3
Opened by sayakpaul (HF Staff)
Files changed (1): README.md (+5 −0)
```diff
@@ -2,4 +2,9 @@
 tags:
 - kernels
 ---
+
+> [!WARNING]
+> This repository will soon be deleted as it's now deprecated. Please use [kernels-community/layer-norm](https://huggingface.co/kernels-community/layer-norm).
+
+
 This CUDA extension implements fused dropout + residual + LayerNorm from the [flash-attention](https://github.com/Dao-AILab/flash-attention/tree/main/csrc/layer_norm) repo.
```
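For context on what the deprecated extension computes: the README's "fused dropout + residual + LayerNorm" corresponds, in unfused PyTorch terms, to roughly the sketch below. This is a minimal reference only; the function name `dropout_add_layer_norm_ref` and its signature are illustrative and not part of the actual CUDA extension, which performs all three steps in a single kernel pass.

```python
import torch
import torch.nn.functional as F

def dropout_add_layer_norm_ref(x, residual, weight, bias,
                               p=0.1, eps=1e-5, training=True):
    # Unfused reference: dropout on x, add the residual branch,
    # then LayerNorm over the last dimension.
    out = F.dropout(x, p=p, training=training) + residual
    return F.layer_norm(out, (out.shape[-1],), weight, bias, eps)

# Example usage (eval mode, so dropout is a no-op):
x = torch.randn(2, 4, 8)
res = torch.randn(2, 4, 8)
w = torch.ones(8)
b = torch.zeros(8)
y = dropout_add_layer_norm_ref(x, res, w, b, training=False)
```

The fused CUDA kernel avoids materializing the intermediate `dropout(x) + residual` tensor in global memory, which is the main benefit over this unfused version.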