Hi @rgtjf, could you elaborate on your request, please? Is there a specific change you'd like to see in Transformer Engine regarding flash-attn? At the moment, this is what we support from flash-attn v3:
wget -P $python_path/flashattn_hopper https://raw.githubusercontent.com/Dao-AILab/flash-attention/284e2c6e5beff017996d72de6e028b2dc605acf8/hopper/flash_attn_interface.py
The newer version of that file is at https://github.com/Dao-AILab/flash-attention/blob/main/hopper/flash_attn_interface.py.
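For context, here is a minimal sketch of the install sequence that wget command fits into. It assumes FA3 is built from the `hopper/` subdirectory of the flash-attention repo and that the interface file is placed in a `flashattn_hopper` directory inside site-packages so it can be imported as `flashattn_hopper.flash_attn_interface`; the site-packages lookup and the choice of `main` vs. the pinned commit are illustrative, not prescriptive.

```bash
# Build flash-attn v3 (FA3) from the hopper/ subdirectory (assumed layout).
git clone https://github.com/Dao-AILab/flash-attention.git
cd flash-attention/hopper
python setup.py install

# Locate the active site-packages directory.
python_path=$(python -c "import site; print(site.getsitepackages()[0])")

# Place the FA3 Python interface alongside the compiled extension so it is
# importable as flashattn_hopper.flash_attn_interface.
mkdir -p "$python_path/flashattn_hopper"
wget -P "$python_path/flashattn_hopper" \
  https://raw.githubusercontent.com/Dao-AILab/flash-attention/main/hopper/flash_attn_interface.py
```

Note the raw.githubusercontent.com URL: fetching the github.com/.../blob/... page with wget would download the HTML page rather than the Python source file.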