runtime error

Exit code: 1. Reason: ██████████| 5/5 [00:09<00:00, 1.93s/it]
Download complete: 100%|██████████| 16.4G/16.4G [00:09<00:00, 1.75GB/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 52, in <module>
    model = AutoModelForCausalLM.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 372, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4072, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/qwen3/modeling_qwen3.py", line 460, in __init__
    super().__init__(config)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 1295, in __init__
    self.config._attn_implementation_internal = self._check_and_adjust_attn_implementation(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 1879, in _check_and_adjust_attn_implementation
    lazy_import_flash_attention(applicable_attn_implementation)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_flash_attention_utils.py", line 165, in lazy_import_flash_attention
    _flash_fn, _flash_varlen_fn, _pad_fn, _unpad_fn = _lazy_imports(implementation, attention_wrapper)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_flash_attention_utils.py", line 94, in _lazy_imports
    from flash_attn import flash_attn_func, flash_attn_varlen_func
  File "/usr/local/lib/python3.10/site-packages/flash_attn/__init__.py", line 3, in <module>
    from flash_attn.flash_attn_interface import (
  File "/usr/local/lib/python3.10/site-packages/flash_attn/flash_attn_interface.py", line 15, in <module>
    import flash_attn_2_cuda as flash_attn_gpu
ModuleNotFoundError: No module named 'flash_attn_2_cuda'
Download complete: 100%|██████████| 16.4G/16.4G [00:10<00:00, 1.64GB/s]
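The failure is the final ModuleNotFoundError: transformers resolved the attention backend to flash_attention_2 and imported the flash_attn package, but the package's compiled CUDA extension (flash_attn_2_cuda) is missing from the container. That typically means flash-attn was installed from a wheel that does not match the image's PyTorch/CUDA build, or was installed without compiling the extension. A minimal sketch of a workaround, assuming app.py loads a Qwen3 checkpoint with AutoModelForCausalLM (the model id, dtype, and device_map below are illustrative assumptions, not the Space's actual code): request an attention implementation that never imports flash_attn.

    # Minimal sketch, not the Space's actual app.py: model id, dtype and
    # device_map are placeholders chosen for illustration.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Qwen/Qwen3-8B"  # placeholder: substitute the checkpoint the Space actually downloads

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,   # assumed dtype; keep whatever the Space used before
        attn_implementation="sdpa",   # PyTorch scaled_dot_product_attention instead of
                                      # flash_attention_2, so flash_attn is never imported
        device_map="auto",
    )

Alternatively, reinstalling flash-attn against the container's exact PyTorch/CUDA versions (for example, pip install flash-attn --no-build-isolation in the Space's setup) would restore the missing extension, but switching to SDPA removes the dependency entirely.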
