# Semantic Search

MUR v2 uses LanceDB for vector storage, combined with BM25 keyword scoring, to produce hybrid ranked results.
## How It Works

When you run `mur search "error handling"`, MUR performs three steps:

- **Semantic search** — converts your query to an embedding vector and finds similar patterns by cosine similarity
- **BM25 keyword search** — traditional text matching for exact keyword hits
- **Hybrid ranking** — combines both scores into the final ranked results
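The three steps above can be sketched in a few lines. Note that the 50/50 fusion weights, the term-overlap stand-in for BM25, and the toy two-dimensional vectors below are illustrative assumptions, not MUR's actual implementation:

```python
# Minimal sketch of hybrid ranking: blend cosine similarity with a
# (much simplified) keyword score. Weights and scoring details are
# illustrative assumptions, not MUR's real algorithm.
import math

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def keyword_score(query: str, text: str) -> float:
    # Stand-in for BM25: fraction of query terms present in the text.
    terms = query.lower().split()
    hits = sum(1 for t in terms if t in text.lower())
    return hits / len(terms) if terms else 0.0

def hybrid_rank(query, query_vec, patterns, w_vec=0.5, w_kw=0.5):
    scored = []
    for name, vec, text in patterns:
        score = w_vec * cosine(query_vec, vec) + w_kw * keyword_score(query, text)
        scored.append((name, round(score, 2)))
    return sorted(scored, key=lambda s: s[1], reverse=True)

patterns = [
    ("error-handling-auth", [0.9, 0.1], "retry on auth error with backoff"),
    ("swift-testing-macro", [0.1, 0.9], "use the @Test macro in Swift Testing"),
]
print(hybrid_rank("auth error", [1.0, 0.0], patterns))
```

A pattern that matches both semantically and on exact keywords outranks one that matches on only one signal, which is the point of the hybrid approach.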
## Setup

### OpenAI (Recommended)

Best quality at very low cost (~$0.001 per 200 patterns):

```bash
export OPENAI_API_KEY=sk-...
```

```yaml
# ~/.mur/config.yaml
search:
  provider: openai
  model: text-embedding-3-small
  api_key_env: OPENAI_API_KEY
```
### Ollama (Free, Local)

No API key needed; everything runs on your machine:

```bash
ollama pull qwen3-embedding
```

```yaml
# ~/.mur/config.yaml
search:
  provider: ollama
  model: qwen3-embedding
```
## Building the Index

After configuring a search provider, build (or rebuild) the index:

```bash
mur reindex
```

This processes all YAML pattern files and writes vector embeddings to `~/.mur/index/`. The index is fully rebuildable — YAML files are always the source of truth.
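Conceptually, a full rebuild walks the pattern files, embeds each one, and replaces the index wholesale, so no incremental state survives between runs. The deterministic stub embedder and dict-backed index in this sketch are assumptions for illustration; MUR's real index lives in LanceDB under `~/.mur/index/`:

```python
# Conceptual sketch of reindexing: the pattern files are the source of
# truth, so the index can be discarded and rebuilt at any time. The stub
# embedder and in-memory index are illustrative assumptions only.
import hashlib

def embed(text: str) -> list[float]:
    # Deterministic stand-in for a real embedding model.
    digest = hashlib.sha256(text.encode()).digest()
    return [b / 255.0 for b in digest[:8]]

def reindex(patterns: dict[str, str]) -> dict[str, list[float]]:
    # Rebuild from scratch: embed every pattern, carry nothing over.
    return {name: embed(body) for name, body in patterns.items()}

patterns = {
    "error-handling-auth": "retry auth failures with exponential backoff",
    "swift-testing-macro": "prefer the @Test macro over XCTest",
}
index = reindex(patterns)
# Rebuilding from the same source yields an identical index.
assert reindex(patterns) == index
```

Because the rebuild is deterministic given the same source files, deleting the index directory is always safe.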
## Using Search

```bash
# Natural language queries
mur search "how to handle authentication errors"
# → error-handling-auth (0.84)
# → retry-with-backoff (0.71)

# Technical queries
mur search "Swift Testing @Test macro"
# → swift-testing-macro (0.92)
```
## Context Injection

For hook integration, MUR can inject relevant patterns based on the current context:

```bash
# Auto-detect project from current directory
mur context

# Compact output
mur context --compact

# Explicit query
mur inject --query "fix SwiftUI layout bug" --project my-app
```
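A hook wrapper usually wants to fail soft: if `mur` or its index is unavailable, the hook should inject nothing rather than break the surrounding workflow. This helper is a hypothetical sketch (the function name, timeout, and fallback behavior are assumptions, not part of MUR):

```python
# Hypothetical hook helper: fetch compact pattern context, returning ""
# if mur is not installed, errors out, or hangs. All behavior here is an
# illustrative assumption, not shipped with MUR.
import shutil
import subprocess

def mur_context_compact() -> str:
    if shutil.which("mur") is None:
        return ""  # mur not installed: inject nothing instead of failing
    try:
        result = subprocess.run(
            ["mur", "context", "--compact"],
            capture_output=True, text=True, timeout=10,
        )
        return result.stdout if result.returncode == 0 else ""
    except (OSError, subprocess.TimeoutExpired):
        return ""

print(mur_context_compact())
```

A hook can then prepend the returned text to its prompt only when it is non-empty.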
## See Also

- Configuration — full config reference
- Patterns — pattern format