Updated documentation for python installation with anthropic and openai
brnaba-aws committed Nov 26, 2024
1 parent 0c28f3d commit 223f71a
Showing 5 changed files with 54 additions and 14 deletions.
25 changes: 19 additions & 6 deletions README.md
@@ -69,8 +69,8 @@ To quickly get a feel for the Multi-Agent Orchestrator, check out our [Demo App]
Get hands-on experience with the Multi-Agent Orchestrator through our diverse set of examples:

- **Ready-to-run Scripts**: Start locally with our collection of standalone scripts in both Python and TypeScript.
- **Demo Applications**:
- [Chat Demo App](https://awslabs.github.io/multi-agent-orchestrator/cookbook/examples/chat-demo-app/):
- Explore multiple specialized agents handling various domains like travel, weather, math, and health
- [E-commerce Support Simulator](https://awslabs.github.io/multi-agent-orchestrator/cookbook/examples/ecommerce-support-simulator/): Experience AI-powered customer support with:
- Automated response generation for common queries
@@ -83,8 +83,8 @@ Get hands-on experience with the Multi-Agent Orchestrator through our diverse se
- [`chat-chainlit-app`](https://github.com/awslabs/multi-agent-orchestrator/tree/main/examples/chat-chainlit-app): Chat application built with Chainlit
- [`fast-api-streaming`](https://github.com/awslabs/multi-agent-orchestrator/tree/main/examples/fast-api-streaming): FastAPI implementation with streaming support
- [`text-2-structured-output`](https://github.com/awslabs/multi-agent-orchestrator/tree/main/examples/text-2-structured-output): Natural Language to Structured Data


All examples are available in both Python and TypeScript implementations. Check out our [documentation](https://awslabs.github.io/multi-agent-orchestrator/) for comprehensive guides on setting up and using the Multi-Agent Orchestrator!


@@ -192,7 +192,7 @@ if (response.streaming == true) {

### Python Version

#### Core Installation

```bash
# Optional: Set up a virtual environment
source venv/bin/activate # On Windows use `venv\Scripts\activate`
pip install multi-agent-orchestrator
```

#### Default Usage

Here's an equivalent Python example demonstrating the use of the Multi-Agent Orchestrator with a Bedrock LLM Agent and a Lex Bot Agent:

@@ -289,6 +289,19 @@ These examples showcase:
3. The orchestrator's ability to route requests to the most appropriate agent based on the input.
4. Handling of both streaming and non-streaming responses from different types of agents.

### Working with Anthropic or OpenAI
To use Anthropic or OpenAI for the classifier and/or agents, install multi-agent-orchestrator with the relevant extra:
```bash
pip install multi-agent-orchestrator[anthropic]
pip install multi-agent-orchestrator[openai]
```

### Full package installation
For a complete installation (including both Anthropic and OpenAI):
```bash
pip install multi-agent-orchestrator[all]
```
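The extras only change which optional provider SDKs get installed; a quick stdlib check (the helper name is illustrative, not part of the package) shows which providers are importable in the current environment:

```python
import importlib.util


def available_providers() -> dict[str, bool]:
    """Report which optional provider SDKs are importable.

    Illustrative helper, not part of multi-agent-orchestrator.
    """
    return {
        name: importlib.util.find_spec(name) is not None
        for name in ("anthropic", "openai")
    }


print(available_providers())
```

If a provider shows up as `False` here, install the matching extra before constructing the corresponding classifier or agent.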


## 🤝 Contributing

@@ -16,6 +16,11 @@ The Anthropic Classifier extends the abstract `Classifier` class and uses the An

### Basic Usage

⚠️ To use the Anthropic Classifier, make sure you have installed multi-agent-orchestrator with the `anthropic` extra (see [Python installation](/multi-agent-orchestrator/general/quickstart#-get-started)):
```bash
pip install multi-agent-orchestrator[anthropic]
```
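The classifier also needs an Anthropic API key. A minimal stdlib sketch for sourcing it from the environment instead of hard-coding it (the helper name and the `ANTHROPIC_API_KEY` variable are conventions assumed here, not mandated by the package):

```python
import os


def anthropic_api_key() -> str:
    """Read the API key from the environment; fail fast with a clear message."""
    key = os.environ.get("ANTHROPIC_API_KEY", "")
    if not key:
        raise RuntimeError(
            "ANTHROPIC_API_KEY is not set; export it before creating the classifier."
        )
    return key


# e.g. pass anthropic_api_key() where the classifier expects its api_key
```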

To use the AnthropicClassifier, you need to create an instance with your Anthropic API key and pass it to the Multi-Agent Orchestrator:

import { Tabs, TabItem } from '@astrojs/starlight/components';
@@ -16,6 +16,12 @@ The OpenAI Classifier extends the abstract `Classifier` class and uses the OpenA

## Basic Usage

⚠️ To use the OpenAI Classifier, make sure you have installed multi-agent-orchestrator with the `openai` extra (see [Python installation](/multi-agent-orchestrator/general/quickstart#-get-started)):

```bash
pip install multi-agent-orchestrator[openai]
```
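If the `openai` extra is missing, the failure surfaces as an import error at runtime. A small stdlib guard (a hypothetical helper, not part of the package) can turn that into an actionable message:

```python
import importlib.util


def require_extra(module: str, extra: str) -> None:
    """Raise with an install hint if an optional dependency is missing."""
    if importlib.util.find_spec(module) is None:
        raise ModuleNotFoundError(
            f"'{module}' is not installed; "
            f"run: pip install 'multi-agent-orchestrator[{extra}]'"
        )


# e.g. require_extra("openai", "openai") before building the OpenAIClassifier
```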

To use the OpenAIClassifier, you need to create an instance with your OpenAI API key and pass it to the Multi-Agent Orchestrator:

import { Tabs, TabItem } from '@astrojs/starlight/components';
15 changes: 9 additions & 6 deletions docs/src/content/docs/general/quickstart.mdx
@@ -38,7 +38,7 @@ To help you kickstart with the Multi-Agent Orchestrator framework, we'll walk yo

2. Authenticate with your AWS account

This quickstart demonstrates the use of Amazon Bedrock for both classification and agent responses.

To authenticate with your AWS account, follow these steps:

@@ -61,8 +61,8 @@ By default, the framework is configured as follows:
<br/>

> **Important**
>
> These are merely default settings and can be easily changed to suit your needs or preferences.
<br/>

@@ -91,7 +91,10 @@ Ensure you have [requested access](https://docs.aws.amazon.com/bedrock/latest/us
</TabItem>
<TabItem label="Python" icon="seti:python">
```bash
pip install multi-agent-orchestrator # for core dependencies
pip install multi-agent-orchestrator[anthropic] # for Anthropic classifier and agent
pip install multi-agent-orchestrator[openai] # for OpenAI classifier and agent
pip install multi-agent-orchestrator[all] # for all packages including Anthropic and OpenAI
```
</TabItem>
</Tabs>
@@ -165,7 +168,7 @@ Ensure you have [requested access](https://docs.aws.amazon.com/bedrock/latest/us
streaming: true
})
);

orchestrator.addAgent(
new BedrockLLMAgent({
name: "Health Agent",
@@ -227,7 +230,7 @@ Ensure you have [requested access](https://docs.aws.amazon.com/bedrock/latest/us
console.error("Received unexpected chunk type:", typeof chunk);
}
}
console.log();
} catch (error) {
console.error("An error occurred:", error);
// Here you could also add more specific error handling if needed
17 changes: 15 additions & 2 deletions python/README.md
@@ -72,7 +72,7 @@ Check out our [documentation](https://awslabs.github.io/multi-agent-orchestrator



### Core Installation

```bash
# Optional: Set up a virtual environment
source venv/bin/activate # On Windows use `venv\Scripts\activate`
pip install multi-agent-orchestrator
```

### Default Usage

Here's an equivalent Python example demonstrating the use of the Multi-Agent Orchestrator with a Bedrock LLM Agent and a Lex Bot Agent:

@@ -176,6 +176,19 @@ This example showcases:
4. Handling of both streaming and non-streaming responses from different types of agents.


### Working with Anthropic or OpenAI
To use Anthropic or OpenAI for the classifier and/or agents, install multi-agent-orchestrator with the relevant extra:
```bash
pip install multi-agent-orchestrator[anthropic]
pip install multi-agent-orchestrator[openai]
```

### Full package installation
For a complete installation (including both Anthropic and OpenAI):
```bash
pip install multi-agent-orchestrator[all]
```
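The variants above differ only in the pip extras. A tiny illustrative helper (not part of the package) that builds the command string, quoting the requirement so shells like zsh don't treat the brackets as a glob:

```python
def install_command(extras: tuple[str, ...] = ()) -> str:
    """Build a pip install command for multi-agent-orchestrator."""
    requirement = "multi-agent-orchestrator"
    if extras:
        requirement += "[" + ",".join(sorted(extras)) + "]"
    # Quotes keep zsh from expanding the square brackets.
    return f'pip install "{requirement}"'


print(install_command(("anthropic",)))
```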

## 🤝 Contributing

We welcome contributions! Please see our [Contributing Guide](https://raw.githubusercontent.com/awslabs/multi-agent-orchestrator/main/CONTRIBUTING.md) for more details.
