diff --git a/README.md b/README.md
index 16510a89..ac9068a7 100644
--- a/README.md
+++ b/README.md
@@ -69,8 +69,8 @@ To quickly get a feel for the Multi-Agent Orchestrator, check out our [Demo App]
Get hands-on experience with the Multi-Agent Orchestrator through our diverse set of examples:
- **Ready-to-run Scripts**: Start locally with our collection of standalone scripts in both Python and TypeScript.
-- **Demo Applications**:
- - [Chat Demo App](https://awslabs.github.io/multi-agent-orchestrator/cookbook/examples/chat-demo-app/):
+- **Demo Applications**:
+ - [Chat Demo App](https://awslabs.github.io/multi-agent-orchestrator/cookbook/examples/chat-demo-app/):
- Explore multiple specialized agents handling various domains like travel, weather, math, and health
- [E-commerce Support Simulator](https://awslabs.github.io/multi-agent-orchestrator/cookbook/examples/ecommerce-support-simulator/): Experience AI-powered customer support with:
- Automated response generation for common queries
@@ -83,8 +83,8 @@ Get hands-on experience with the Multi-Agent Orchestrator through our diverse se
- [`chat-chainlit-app`](https://github.com/awslabs/multi-agent-orchestrator/tree/main/examples/chat-chainlit-app): Chat application built with Chainlit
- [`fast-api-streaming`](https://github.com/awslabs/multi-agent-orchestrator/tree/main/examples/fast-api-streaming): FastAPI implementation with streaming support
- [`text-2-structured-output`](https://github.com/awslabs/multi-agent-orchestrator/tree/main/examples/text-2-structured-output): Natural Language to Structured Data
-
-
+
+
All examples are available in both Python and TypeScript implementations. Check out our [documentation](https://awslabs.github.io/multi-agent-orchestrator/) for comprehensive guides on setting up and using the Multi-Agent Orchestrator!
@@ -192,7 +192,7 @@ if (response.streaming == true) {
### Python Version
-#### Installation
+#### Core Installation
```bash
# Optional: Set up a virtual environment
@@ -201,7 +201,7 @@ source venv/bin/activate # On Windows use `venv\Scripts\activate`
pip install multi-agent-orchestrator
```
-#### Usage
+#### Default Usage
Here's an equivalent Python example demonstrating the use of the Multi-Agent Orchestrator with a Bedrock LLM Agent and a Lex Bot Agent:
@@ -289,6 +289,19 @@ These examples showcase:
3. The orchestrator's ability to route requests to the most appropriate agent based on the input.
4. Handling of both streaming and non-streaming responses from different types of agents.
+### Working with Anthropic or OpenAI
+To use Anthropic or OpenAI for the classifier and/or agents, install multi-agent-orchestrator with the relevant optional feature:
+```bash
+pip install "multi-agent-orchestrator[anthropic]"
+pip install "multi-agent-orchestrator[openai]"
+```
+
+### Full package installation
+For a complete installation (including Anthropic and OpenAI):
+```bash
+pip install "multi-agent-orchestrator[all]"
+```
+
## 🤝 Contributing
diff --git a/docs/src/content/docs/classifiers/built-in/anthropic-classifier.mdx b/docs/src/content/docs/classifiers/built-in/anthropic-classifier.mdx
index a85d9760..cd3ffee3 100644
--- a/docs/src/content/docs/classifiers/built-in/anthropic-classifier.mdx
+++ b/docs/src/content/docs/classifiers/built-in/anthropic-classifier.mdx
@@ -16,6 +16,11 @@ The Anthropic Classifier extends the abstract `Classifier` class and uses the An
### Basic Usage
+⚠️ To use the Anthropic Classifier, make sure you have installed multi-agent-orchestrator with the anthropic feature (see [Python installation](/multi-agent-orchestrator/general/quickstart#-get-started)).
+```bash
+pip install "multi-agent-orchestrator[anthropic]"
+```
+
To use the AnthropicClassifier, you need to create an instance with your Anthropic API key and pass it to the Multi-Agent Orchestrator:
import { Tabs, TabItem } from '@astrojs/starlight/components';
diff --git a/docs/src/content/docs/classifiers/built-in/openai-classifier.mdx b/docs/src/content/docs/classifiers/built-in/openai-classifier.mdx
index d6e4cd03..8da836aa 100644
--- a/docs/src/content/docs/classifiers/built-in/openai-classifier.mdx
+++ b/docs/src/content/docs/classifiers/built-in/openai-classifier.mdx
@@ -16,6 +16,12 @@ The OpenAI Classifier extends the abstract `Classifier` class and uses the OpenA
## Basic Usage
+⚠️ To use the OpenAI Classifier, make sure you have installed multi-agent-orchestrator with the openai feature (see [Python installation](/multi-agent-orchestrator/general/quickstart#-get-started)).
+
+```bash
+pip install "multi-agent-orchestrator[openai]"
+```
+
To use the OpenAIClassifier, you need to create an instance with your OpenAI API key and pass it to the Multi-Agent Orchestrator:
import { Tabs, TabItem } from '@astrojs/starlight/components';
diff --git a/docs/src/content/docs/general/quickstart.mdx b/docs/src/content/docs/general/quickstart.mdx
index 060b2c2e..3643a44b 100644
--- a/docs/src/content/docs/general/quickstart.mdx
+++ b/docs/src/content/docs/general/quickstart.mdx
@@ -38,7 +38,7 @@ To help you kickstart with the Multi-Agent Orchestrator framework, we'll walk yo
2. Authenticate with your AWS account
-This quickstart demonstrates the use of Amazon Bedrock for both classification and agent responses.
+This quickstart demonstrates the use of Amazon Bedrock for both classification and agent responses.
To authenticate with your AWS account, follow these steps:
@@ -61,8 +61,8 @@ By default, the framework is configured as follows:
> **Important**
->
-> These are merely default settings and can be easily changed to suit your needs or preferences.
+>
+> These are merely default settings and can be easily changed to suit your needs or preferences.
@@ -91,7 +91,10 @@ Ensure you have [requested access](https://docs.aws.amazon.com/bedrock/latest/us
```bash
- pip install multi-agent-orchestrator
+ pip install multi-agent-orchestrator # for core dependencies
+ pip install "multi-agent-orchestrator[anthropic]" # for the Anthropic classifier and agent
+ pip install "multi-agent-orchestrator[openai]" # for the OpenAI classifier and agent
+ pip install "multi-agent-orchestrator[all]" # for all features, including Anthropic and OpenAI
```
@@ -165,7 +168,7 @@ Ensure you have [requested access](https://docs.aws.amazon.com/bedrock/latest/us
streaming: true
})
);
-
+
orchestrator.addAgent(
new BedrockLLMAgent({
name: "Health Agent",
@@ -227,7 +230,7 @@ Ensure you have [requested access](https://docs.aws.amazon.com/bedrock/latest/us
console.error("Received unexpected chunk type:", typeof chunk);
}
}
- console.log();
+ console.log();
} catch (error) {
console.error("An error occurred:", error);
// Here you could also add more specific error handling if needed
diff --git a/python/README.md b/python/README.md
index b6b04e8b..d09b65ba 100644
--- a/python/README.md
+++ b/python/README.md
@@ -72,7 +72,7 @@ Check out our [documentation](https://awslabs.github.io/multi-agent-orchestrator
-### Installation
+### Core Installation
```bash
# Optional: Set up a virtual environment
@@ -81,7 +81,7 @@ source venv/bin/activate # On Windows use `venv\Scripts\activate`
pip install multi-agent-orchestrator
```
-### Usage
+### Default Usage
Here's an equivalent Python example demonstrating the use of the Multi-Agent Orchestrator with a Bedrock LLM Agent and a Lex Bot Agent:
@@ -176,6 +176,19 @@ This example showcases:
4. Handling of both streaming and non-streaming responses from different types of agents.
+### Working with Anthropic or OpenAI
+To use Anthropic or OpenAI for the classifier and/or agents, install multi-agent-orchestrator with the relevant optional feature:
+```bash
+pip install "multi-agent-orchestrator[anthropic]"
+pip install "multi-agent-orchestrator[openai]"
+```
+
+### Full package installation
+For a complete installation (including Anthropic and OpenAI):
+```bash
+pip install "multi-agent-orchestrator[all]"
+```
+
## 🤝 Contributing
We welcome contributions! Please see our [Contributing Guide](https://raw.githubusercontent.com/awslabs/multi-agent-orchestrator/main/CONTRIBUTING.md) for more details.