Quick Start
Get dbt-nova running in under 5 minutes.
Prerequisites
- A dbt project with a compiled manifest.json
- An MCP-compatible AI client (Claude Desktop, Cursor, etc.)
Step 1: Install
All-in-one option
Use the installer script. It defaults to the slim release and downloads models on first run.
# Public repo (unauthenticated)
curl -fsSL https://raw.githubusercontent.com/joe-broadhead/dbt-nova/master/scripts/install.sh | \
bash -s -- --slim --non-interactive
# Private repo (authenticated)
GH_TOKEN="$(gh auth token)"
curl -fsSL -H "Authorization: Bearer ${GH_TOKEN}" \
https://raw.githubusercontent.com/joe-broadhead/dbt-nova/master/scripts/install.sh | \
DBT_NOVA_GITHUB_TOKEN="${GH_TOKEN}" bash -s -- --slim --non-interactive
# Optional: pre-warm models during install (instead of first semantic query)
curl -fsSL https://raw.githubusercontent.com/joe-broadhead/dbt-nova/master/scripts/install.sh | \
DBT_NOVA_EMBEDDINGS_CACHE_DIR="$HOME/.dbt-nova/models" \
DBT_NOVA_WARMUP_REQUIRED_MODELS=3 \
bash -s -- --slim --warm-models --non-interactive
# Optional: install Agent Skills into ~/.agents/skills
curl -fsSL https://raw.githubusercontent.com/joe-broadhead/dbt-nova/master/scripts/install.sh | \
bash -s -- --slim --install-skills --non-interactive
Build from source option
git clone https://github.com/joe-broadhead/dbt-nova.git
cd dbt-nova
cargo build --release
# default mode is partial (model files only); falls back if direct download fails
bash scripts/warm_models.sh
# optional integrity enforcement for direct HF fallback:
# DBT_NOVA_WARMUP_CHECKSUM_MODE=required DBT_NOVA_WARMUP_CHECKSUM_FILE=/path/to/checksums.txt bash scripts/warm_models.sh
# optional full mode: model files + embedding caches for a specific manifest
# DBT_NOVA_WARMUP_MANIFEST_PATH=/path/to/manifest.json bash scripts/warm_models.sh full
sudo cp target/release/dbt-nova /usr/local/bin/
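After copying the binary into place, a quick PATH check (plain POSIX shell, nothing dbt-nova-specific) confirms it is discoverable before you wire up a client:

```shell
# Check whether the installed binary is visible on PATH.
BIN=dbt-nova
if command -v "$BIN" >/dev/null 2>&1; then
  echo "found $BIN at: $(command -v "$BIN")"
else
  echo "$BIN is not on PATH - check /usr/local/bin or ~/.local/bin"
fi
```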
Step 2: Configure
Set your manifest path and cache locations:
export PATH="$HOME/.local/bin:$PATH"
export DBT_MANIFEST_PATH=/path/to/your/dbt/project/target/manifest.json
export DBT_NOVA_EMBEDDINGS_CACHE_DIR="$HOME/.dbt-nova/models"
Use the same DBT_NOVA_EMBEDDINGS_CACHE_DIR value in your MCP client config.
Pro Tip
Add this to your shell profile (~/.bashrc, ~/.zshrc) to persist across sessions.
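For example, appending to ~/.zshrc (the manifest path below is a placeholder for your own project):

```shell
# Append the exports to your shell profile so new sessions inherit them.
cat >> ~/.zshrc <<'EOF'
export DBT_MANIFEST_PATH=/path/to/your/dbt/project/target/manifest.json
export DBT_NOVA_EMBEDDINGS_CACHE_DIR="$HOME/.dbt-nova/models"
EOF
```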
Optional: Consume Prebuilt Artifacts (Read-Only)
If you build assets in CI with the reusable workflow, consumers can start without manual extraction by setting:
- DBT_NOVA_STORAGE_INSTANCE_ID
- DBT_NOVA_STORAGE_READ_ONLY=true
- DBT_NOVA_STORAGE_ARTIFACT_URI
- DBT_NOVA_METADATA_ARTIFACT_URI - optional
- DBT_NOVA_MODELS_ARTIFACT_URI - optional
- DBT_NOVA_ARTIFACT_FETCH_POLICY (recommended: if_missing)
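A consumer-side shell fragment might look like this (the instance ID and URIs are illustrative placeholders, not real endpoints):

```shell
# Read-only consumer: point dbt-nova at artifacts produced in CI.
export DBT_NOVA_STORAGE_INSTANCE_ID=my-project                            # placeholder
export DBT_NOVA_STORAGE_READ_ONLY=true
export DBT_NOVA_STORAGE_ARTIFACT_URI=s3://my-bucket/nova/storage.tar.gz   # placeholder
export DBT_NOVA_METADATA_ARTIFACT_URI=s3://my-bucket/nova/metadata.tar.gz # optional, placeholder
export DBT_NOVA_ARTIFACT_FETCH_POLICY=if_missing
```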
See Prebuilt Asset Workflow for full producer/consumer setup.
Optional: Remote Manifests
dbt-nova supports loading manifests from remote sources:
# From S3
export DBT_NOVA_MANIFEST_URI=s3://my-bucket/manifest.json
# From GCS
export DBT_NOVA_MANIFEST_URI=gs://my-bucket/manifest.json
# From HTTP
export DBT_NOVA_MANIFEST_URI=https://example.com/manifest.json
# From Databricks DBFS
export DBT_NOVA_MANIFEST_URI=dbfs:///path/to/manifest.json
See Manifest Sources for authentication details.
Step 3: Connect to MCP Client
Claude Desktop
Add to ~/Library/Application Support/Claude/claude_desktop_config.json:
{
"mcpServers": {
"dbt-nova": {
"command": "dbt-nova",
"env": {
"DBT_MANIFEST_PATH": "/path/to/manifest.json",
"DBT_NOVA_EMBEDDINGS_CACHE_DIR": "/Users/<you>/.dbt-nova/models"
}
}
}
}
Cursor
Add to .cursor/mcp.json in your project:
{
"mcpServers": {
"dbt-nova": {
"command": "dbt-nova",
"env": {
"DBT_MANIFEST_PATH": "/path/to/manifest.json",
"DBT_NOVA_EMBEDDINGS_CACHE_DIR": "/Users/<you>/.dbt-nova/models"
}
}
}
}
See MCP Clients for more client configurations.
Step 4: Verify
Test the server directly:
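MCP servers speak JSON-RPC 2.0 over stdio, so one way to smoke-test is to hand-craft a tools/list request and pipe it to the binary. This is a minimal sketch, not dbt-nova's documented CLI; your build may also expect an initialize handshake before tools/list.

```shell
# Build a minimal JSON-RPC tools/list request, then pipe it to the server:
#   printf '%s\n' "$REQUEST" | dbt-nova
REQUEST='{"jsonrpc":"2.0","id":1,"method":"tools/list"}'
printf '%s\n' "$REQUEST"
```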
You should see a JSON response listing all available tools.
You're Ready!
Once connected to your MCP client, you can ask questions like:
- "Search for models related to customers"
- "Show me the lineage of the orders model"
- "What's the test coverage for staging models?"
- "Score the metadata quality of my project"
Step 5: Run Your First Recipe
For recurring workflows (weekly KPI packs, MBRs, channel reports), start with recipes instead of ad-hoc SQL.
1) Discover a recipe
{
"name": "search_recipes",
"arguments": {
"topic": "weekly",
"query": "digital country",
"include_queries": true,
"limit": 5
}
}
2) Inspect the recipe contract
{
"name": "get_recipe",
"arguments": {
"recipe_id": "weekly_country_kpi_report",
"include_queries": true
}
}
3) Execute deterministically
{
"name": "run_recipe",
"arguments": {
"recipe_id": "weekly_country_kpi_report",
"parameters": {
"COUNTRY_CODE": "GB",
"WEEK_START": "2026-02-01",
"WEEK_END": "2026-02-07"
},
"stop_on_failure": true
}
}
See Analysis Recipes for naming conventions, parameter contracts, and execution order rules.
Next Steps
- CLI Commands - One-shot command groups, JSON mode, and exit codes
- Modes & Combinations - Install/runtime decision matrix
- Configuration Reference - All configuration options
- Tools Reference - Available tools and parameters
- Personas - Workflow guides by role
- Analysis Recipes - Deterministic recurring workflows
- Metadata Scoring - Understand quality scores