From ce3fe411f82a143d496b1c17b805708b90916a0a Mon Sep 17 00:00:00 2001
From: Joshua David
Date: Thu, 18 Jul 2024 22:01:05 -0700
Subject: [PATCH] Update README.md to be more detailed

---
 README.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/README.md b/README.md
index bfa080b..64578c4 100644
--- a/README.md
+++ b/README.md
@@ -154,6 +154,7 @@ The architecture incorporates several structural modifications to handle the inc
 - **Attention Mechanisms**: Enhanced attention mechanisms are integrated to ensure that the model can focus on relevant parts of the input sequence, even with the extended context.
+- **Token-wise Attention**: Token-wise attention mechanisms are introduced to capture the contextual relationships between tokens, allowing the model to better understand the semantic meaning of the input.

 ### Performance and Applications
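The bullet added by this patch describes token-wise attention, where each token attends to every other token in the sequence to build context-aware representations. As a rough illustration of that idea (not the repository's actual implementation), here is a minimal single-head scaled dot-product self-attention sketch in NumPy; all names and shapes are illustrative:

```python
import numpy as np

def token_wise_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a token sequence.

    Every token computes affinities with every other token, so each
    output row mixes in context from the whole sequence.
    x: (seq_len, d_model) token embeddings
    w_q, w_k, w_v: (d_model, d_k) projection matrices
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    # (seq_len, seq_len) token-pair affinities, scaled by sqrt(d_k)
    scores = q @ k.T / np.sqrt(k.shape[-1])
    # softmax over the token axis: each row is a distribution over tokens
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # context-aware token representations
    return weights @ v

rng = np.random.default_rng(0)
d = 8
x = rng.normal(size=(5, d))  # 5 tokens, 8-dim embeddings
out = token_wise_attention(x, *(rng.normal(size=(d, d)) for _ in range(3)))
print(out.shape)  # (5, 8): one context-mixed vector per token
```

In practice this would be one head of a multi-head attention layer; the sketch only shows the token-to-token weighting the bullet refers to.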