A command-line tool to save local Ollama AI conversations directly to your Obsidian vault. Built for macOS.
- Python 3
- Ollama installed and configured
- obsidian-cli installed and configured
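Before installing, you can confirm the prerequisites are on your PATH (a plain shell check, not part of the tool):

```shell
# Prints each tool's path if installed, or a warning if it is missing
for tool in python3 ollama obsidian-cli; do
    command -v "$tool" || echo "missing: $tool"
done
```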
- Create the necessary directory:
sudo mkdir -p /usr/local/lib/ai2obsidian
- Copy the Python script:
sudo cp ollama_to_obsidian.py /usr/local/lib/ai2obsidian/
sudo chmod +x /usr/local/lib/ai2obsidian/ollama_to_obsidian.py
- Install the command-line tool:
sudo cp ai2ob /usr/local/bin/
sudo chmod +x /usr/local/bin/ai2ob
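The installation steps above can be sketched as a small POSIX-shell function; `install_ai2ob`, the `install.sh` filename, and the `PREFIX` argument are illustrative names, not part of the tool:

```shell
#!/bin/sh
# Sketch of the installation steps above as a reusable function.
# install_ai2ob PREFIX copies the script and launcher under PREFIX.
# For the system-wide layout in this README, run it as:
#   sudo sh -c '. ./install.sh; install_ai2ob /usr/local'
install_ai2ob() {
    prefix="$1"
    mkdir -p "$prefix/lib/ai2obsidian" "$prefix/bin"
    cp ollama_to_obsidian.py "$prefix/lib/ai2obsidian/"
    chmod +x "$prefix/lib/ai2obsidian/ollama_to_obsidian.py"
    cp ai2ob "$prefix/bin/"
    chmod +x "$prefix/bin/ai2ob"
}
```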
- Configure obsidian-cli by setting your default vault:
obsidian-cli set-default "your-vault-name"
Basic usage:
ai2ob "Your prompt here"
With custom filename:
ai2ob "Your prompt here" my_custom_filename
With different Ollama model:
ai2ob -m mistral "Your prompt here"
Combined options:
ai2ob -m mistral "Your prompt here" my_custom_filename
-m <model>
: Specify which Ollama model to use (default: llama3.1:8b)
-h
: Show the help message
The conversations will be saved in your Obsidian vault with the following structure:
Your-Vault/
└── AI_Conversations/
└── your_conversation.md
Each file includes:
- YAML frontmatter with:
- Creation date
- Year
- Type: ai_conversation
- The original prompt
- The AI's response in markdown format
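As a sketch, a saved note might look like the following; the exact field names and layout come from the script, so treat this as illustrative:

```markdown
---
created: 2025-01-15 14:30
year: 2025
type: ai_conversation
---

## Prompt
Your prompt here

## Response
The model's answer, in markdown.
```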
- Automatically opens the new note in Obsidian after creation
- Includes the original prompt in the saved note
- Organizes conversations by date and time if no filename is specified
- Supports all Ollama models
If the command is not found, check how the shell resolves it:
which ai2ob
This should return /usr/local/bin/ai2ob. If it does not, make sure /usr/local/bin is in your PATH.
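For example, with zsh (the default shell on modern macOS) you can put /usr/local/bin on your PATH like this:

```shell
# Make /usr/local/bin visible in the current session...
export PATH="/usr/local/bin:$PATH"
# ...and persist it for future zsh sessions
echo 'export PATH="/usr/local/bin:$PATH"' >> ~/.zshrc
```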
If you get permission errors when saving files:
sudo chown -R $(whoami) /usr/local/lib/ai2obsidian
MIT License