alexnasa committed
Commit e7bcfcf · verified · 1 Parent(s): fe5e0bf

Update README.md

Files changed (1)
  1. README.md +4 -40
README.md CHANGED
@@ -4,44 +4,8 @@ license: bsd-3-clause
  This repo provides the wheel for Flash Attention 3 for Hopper architecture for the current default cuda version on ZeroGPU spaces,
  which at the moment is Torch version: 2.8.0+cu128.
 
- To download and install the code in your Space you can copy/paste the code snippet below
-
- ```Python
- import subprocess, site, importlib
- from huggingface_hub import hf_hub_download
-
- # Re-discover all .pth/.egg-link files
- for sitedir in site.getsitepackages():
-     site.addsitedir(sitedir)
-
- # Clear caches so importlib will pick up new modules
- importlib.invalidate_caches()
-
- def sh(cmd): subprocess.check_call(cmd, shell=True)
-
- flash_attention_installed = False
-
- try:
-     print("Attempting to download and install FlashAttention wheel...")
-     flash_attention_wheel = hf_hub_download(
-         repo_id="alexnasa/flash-attn-3",
-         repo_type="model",
-         filename="128/flash_attn_3-3.0.0b1-cp39-abi3-linux_x86_64.whl",
-     )
-
-     sh(f"pip install {flash_attention_wheel}")
-
-     # tell Python to re-scan site-packages now that the egg-link exists
-     import importlib, site; site.addsitedir(site.getsitepackages()[0]); importlib.invalidate_caches()
-
-     flash_attention_installed = True
-     print("FlashAttention installed successfully.")
-
- except Exception as e:
-     print(f"⚠️ Could not install FlashAttention: {e}")
-     print("Continuing without FlashAttention...")
-
- import torch
- print(f"Torch version: {torch.__version__}")
- print(f"FlashAttention available: {flash_attention_installed}")
- ```
+ You can add this to the requirements.txt file of your Space
+
+ ```
+ flash-attn-3 @ https://huggingface.co/alexnasa/flash-attn-3/resolve/main/128/flash_attn_3-3.0.0b1-cp39-abi3-linux_x86_64.whl
+ ```
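
If you want to confirm at runtime that the wheel was actually installed from requirements.txt, a minimal check like the sketch below can go near the top of your Space's app.py. This is only a sketch under stated assumptions: the distribution name `flash-attn-3` is inferred from the wheel filename, and `flash_attn_interface` is an assumed module name that may differ from what the wheel actually installs.

```Python
# Minimal sketch: verify the Flash Attention 3 wheel is present in the Space.
# Assumptions: distribution name "flash-attn-3" (inferred from the wheel
# filename) and module name "flash_attn_interface" (not confirmed by this repo).
import importlib.metadata
import importlib.util

try:
    version = importlib.metadata.version("flash-attn-3")
    print(f"flash-attn-3 installed, version: {version}")
except importlib.metadata.PackageNotFoundError:
    print("flash-attn-3 not found; check your requirements.txt entry")

# find_spec locates the module without importing it (so no CUDA init at startup).
if importlib.util.find_spec("flash_attn_interface") is not None:
    print("flash_attn_interface module is importable")
```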