# Configuration

OWL's configuration is stored in `~/.owl/config.yaml`.

## Configuration File

Create or edit `~/.owl/config.yaml`:

```yaml
llm:
  provider: ollama
  host: http://localhost:11434
  model: glm-4.6:cloud
  timeout: 60
  temperature: 0.7

daemon:
  socket_path: ~/.owl/owl.sock
  log_level: INFO
```
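If you prefer to script the bootstrap step, the same starter file can be written from Python. This is a convenience sketch, not part of OWL itself; it writes plain text so no YAML library is needed, and it ignores the `OWL_HOME` relocation covered under Environment Variables.

```python
from pathlib import Path

# Create ~/.owl and write the default config shown above.
# (Assumes the default location; OWL_HOME relocation is not handled here.)
config_dir = Path.home() / ".owl"
config_dir.mkdir(parents=True, exist_ok=True)

config_file = config_dir / "config.yaml"
config_file.write_text(
    "llm:\n"
    "  provider: ollama\n"
    "  host: http://localhost:11434\n"
    "  model: glm-4.6:cloud\n"
    "  timeout: 60\n"
    "  temperature: 0.7\n"
    "daemon:\n"
    "  socket_path: ~/.owl/owl.sock\n"
    "  log_level: INFO\n"
)
print(f"wrote {config_file}")
```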
## LLM Settings

### provider

The LLM provider to use. Currently only `ollama` is supported.

```yaml
llm:
  provider: ollama
```
### host

The Ollama server address:

```yaml
llm:
  host: http://localhost:11434
```
### model

The model to use for chat:

```yaml
llm:
  model: glm-4.6:cloud    # Fast cloud model
  # model: llama2:latest
  # model: mistral:latest
  # model: codellama:latest
```
### timeout

Request timeout in seconds:

```yaml
llm:
  timeout: 60    # Increase for slower models
```
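The timeout bounds a single blocking HTTP call to the Ollama server. A minimal sketch of such a call, using Ollama's real `/api/generate` route but an illustrative wrapper function (this is not OWL's actual client code):

```python
import json
import urllib.request

def build_request(host, model, prompt):
    """Construct a non-streaming generate request for an Ollama server."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        f"{host}/api/generate",
        data=payload.encode(),
        headers={"Content-Type": "application/json"},
    )

def ask(host, model, prompt, timeout=60):
    # urlopen raises an error if the server takes longer than
    # `timeout` seconds; slower models therefore need a larger value.
    with urllib.request.urlopen(build_request(host, model, prompt),
                                timeout=timeout) as resp:
        return json.load(resp)["response"]
```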
### temperature

Sampling temperature (0.0 - 1.0):

```yaml
llm:
  temperature: 0.7    # Lower = more focused, higher = more creative
```
## Daemon Settings

### socket_path

The Unix socket location:

```yaml
daemon:
  socket_path: ~/.owl/owl.sock
```
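Since the daemon listens on this Unix socket, a quick way to check whether `owld` is up is to try connecting. This helper is illustrative only; the daemon's wire protocol is not documented here.

```python
import os
import socket

def daemon_listening(path="~/.owl/owl.sock"):
    """Return True if something is accepting connections on the socket."""
    expanded = os.path.expanduser(path)
    s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    try:
        s.connect(expanded)
        return True
    except OSError:  # socket file missing, connection refused, etc.
        return False
    finally:
        s.close()

print(daemon_listening())
```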
### log_level

Logging verbosity:

```yaml
daemon:
  log_level: INFO    # DEBUG, INFO, WARNING, or ERROR
```
## Environment Variables

Environment variables override values from the config file:

```shell
# LLM settings
export OWL_LLM_HOST=http://localhost:11434
export OWL_LLM_MODEL=llama2:latest
export OWL_LLM_TIMEOUT=120
export OWL_LLM_PROVIDER=ollama

# Daemon settings
export OWL_LOG_LEVEL=DEBUG
export OWL_SOCKET_PATH=/custom/path/owl.sock

# Base directory (default: ~/.owl)
export OWL_HOME=/custom/owl/home
```
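OWL's exact precedence logic isn't shown here, but the usual pattern (environment beats file) can be sketched using the variable names above. The mapping table and function below are assumptions for illustration, not OWL's real loader.

```python
import os

# Assumed mapping from the environment variables above to config keys.
ENV_MAP = {
    "OWL_LLM_PROVIDER": ("llm", "provider"),
    "OWL_LLM_HOST": ("llm", "host"),
    "OWL_LLM_MODEL": ("llm", "model"),
    "OWL_LLM_TIMEOUT": ("llm", "timeout"),
    "OWL_LOG_LEVEL": ("daemon", "log_level"),
    "OWL_SOCKET_PATH": ("daemon", "socket_path"),
}

def apply_env_overrides(config, env=None):
    """Return a copy of config with any OWL_* environment overrides applied."""
    env = os.environ if env is None else env
    merged = {section: dict(values) for section, values in config.items()}
    for name, (section, key) in ENV_MAP.items():
        if name in env:
            merged.setdefault(section, {})[key] = env[name]
    return merged

file_config = {"llm": {"model": "glm-4.6:cloud", "timeout": 60}}
merged = apply_env_overrides(file_config, {"OWL_LLM_MODEL": "llama2:latest"})
print(merged["llm"]["model"])  # the environment value wins
```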
## Data Directories

OWL uses these directories (relative to `OWL_HOME`, default `~/.owl`):

| Path | Purpose |
|---|---|
| `config.yaml` | Configuration file |
| `soul.yaml` | OWL's character and personality |
| `memory.md` | Human-readable memory |
| `memory/owl.db` | SQLite database |
| `knowledge/chroma/` | Vector database |
| `owl.sock` | Unix socket (runtime) |
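A small script can report which of these paths exist yet; some appear only after the first run. The path list mirrors the table above, and the helper honors the `OWL_HOME` override from the Environment Variables section.

```python
import os
from pathlib import Path

# Paths from the table above, relative to the OWL home directory.
DATA_PATHS = [
    "config.yaml", "soul.yaml", "memory.md",
    "memory/owl.db", "knowledge/chroma", "owl.sock",
]

def data_status(owl_home=None):
    """Map each known data path to whether it exists on disk."""
    home = Path(owl_home or os.environ.get("OWL_HOME", Path.home() / ".owl"))
    return {rel: (home / rel).exists() for rel in DATA_PATHS}

for rel, present in data_status().items():
    print(("found   " if present else "missing ") + rel)
```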
## Example Configurations

### Development (Fast)

```yaml
llm:
  provider: ollama
  host: http://localhost:11434
  model: glm-4.6:cloud
  timeout: 30
  temperature: 0.7

daemon:
  log_level: DEBUG
```
### Production (Reliable)

```yaml
llm:
  provider: ollama
  host: http://localhost:11434
  model: llama2:70b
  timeout: 120
  temperature: 0.5

daemon:
  log_level: WARNING
```
### Remote Ollama

```yaml
llm:
  provider: ollama
  host: http://<host-ip>:11434
  model: mixtral:latest
  timeout: 90
  temperature: 0.7
```
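Before pointing OWL at a remote host, it is worth confirming the Ollama server is reachable. `/api/tags` is Ollama's model-listing endpoint; the helper function itself is an illustrative sketch.

```python
import urllib.request

def ollama_reachable(host, timeout=5):
    """Return True if the Ollama server at `host` answers /api/tags."""
    try:
        with urllib.request.urlopen(f"{host}/api/tags", timeout=timeout):
            return True
    except OSError:  # DNS failure, refused connection, timeout, HTTP error
        return False

print(ollama_reachable("http://localhost:11434"))
```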
## Changing Models at Runtime

You can switch models without editing `config.yaml` by restarting the daemon with an environment override:

```shell
# In the CLI
/status    # See the current model

# Restart the daemon with a different model
# Terminal 1: Ctrl+C to stop owld, then:
OWL_LLM_MODEL=mistral:latest owld
```