Mention the flash-attention restriction in the readme. (#1158)
LaurentMazare authored Oct 23, 2023
1 parent a11af79 commit 25c3cc4
Showing 1 changed file with 3 additions and 0 deletions.
candle-examples/examples/stable-diffusion/README.md (3 additions, 0 deletions)
@@ -50,6 +50,9 @@ cached.
Enabling flash-attention requires both a feature flag, `--features flash-attn`,
and the command line flag `--use-flash-attn`.

Note that flash-attention-v2 is only compatible with Ampere, Ada, or Hopper GPUs
(e.g., A100/H100, RTX 3090/4090).

## Image to Image Pipeline
...

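For readers trying this change out, the sketch below shows how the build-time feature flag and the runtime flag from the diff above might be combined into a single invocation. It is a hedged example rather than a command taken from the README: the `cuda` feature and the `--prompt` argument are assumptions about the example's typical usage and may differ from the actual repository.

```bash
# Hypothetical invocation: build the stable-diffusion example with the
# flash-attn feature enabled, then request flash-attention at run time.
# The cuda feature and the --prompt argument are assumptions, not taken
# from the diff above. Requires an Ampere, Ada, or Hopper GPU
# (e.g. A100/H100, RTX 3090/4090).
cargo run --example stable-diffusion --release --features cuda,flash-attn -- \
    --use-flash-attn \
    --prompt "a rusty robot holding a candle"
```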
