Troubleshooting¶
Start with doctor¶
Run this first to catch environment/runtime issues before longer builds:
slideflow doctor --config-file config.yml --registry registry.py --strict --output-json doctor-result.json
Validation fails¶
If validation fails, common causes include:
- provider config missing required fields
- unresolved function names in registry
- invalid replacement/chart schema shape
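As a point of reference, a minimal config sketch touching all three areas might look like the following; the field names below are illustrative assumptions, not slideflow's exact schema:

```yaml
# Illustrative only -- field names are assumptions; check your schema docs.
provider:
  type: google_slides          # provider config must include its required fields
  config:
    credentials: /path/to/service-account.json
replacements:
  - placeholder: "{title}"
    function: make_title       # must resolve to a function defined in registry.py
```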
Build fails before rendering¶
Common causes:
- missing Google credentials (`provider.config.credentials`, `GOOGLE_DOCS_CREDENTIALS`, or `GOOGLE_SLIDEFLOW_CREDENTIALS`)
- invalid template ID or target IDs (`slide.id` for `google_slides`, section marker ids for `google_docs`)
- unreadable CSV/JSON input path
- query/auth issues for Databricks connectors
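For the credentials cause, a quick pre-flight sketch (plain Python, not part of slideflow) can report which of the env vars named above are set; it does not cover the `provider.config.credentials` config path:

```python
import os

def check_google_credentials(env=None):
    """Report which known Google credential env vars are set."""
    env = os.environ if env is None else env
    names = ("GOOGLE_DOCS_CREDENTIALS", "GOOGLE_SLIDEFLOW_CREDENTIALS")
    return {name: name in env for name in names}

if __name__ == "__main__":
    for name, present in check_google_credentials().items():
        print(f"{name}: {'set' if present else 'MISSING'}")
```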
For Google identity and Shared Drive setup, see Google Service Accounts & Shared Drives.
Google service-account and Shared Drive errors¶
| Error / Symptom | Likely Cause | Remediation |
|---|---|---|
| `storageQuotaExceeded` | writing into the service account's My Drive | write to Shared Drive output folders (`presentation_folder_id`, `document_folder_id`, `drive_folder_id`) |
| `consentRequiredForOwnershipTransfer` | domain policy blocks ownership transfer in My Drive | use Shared Drive outputs or disable transfer settings |
| "Ownership transfer is not supported for files in Shared Drives" | transfer settings enabled for Shared Drive outputs | remove `transfer_ownership_to` and `transfer_ownership_strict` |
| "File not found" for template/folder | runtime service account cannot access the target | add the service account to the Shared Drive and share template/output folders |
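The remediations map to output settings along these lines; the folder-ID keys come from the table above, while the nesting is an assumption against your schema:

```yaml
# Sketch: send outputs to Shared Drive folders rather than the
# service account's My Drive. IDs below are placeholders.
provider:
  config:
    presentation_folder_id: "<shared-drive-folder-id>"
    document_folder_id: "<shared-drive-folder-id>"
    drive_folder_id: "<shared-drive-folder-id>"
    # For Shared Drive outputs, remove ownership-transfer settings:
    # transfer_ownership_to / transfer_ownership_strict
```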
Batch mode fails early¶
If using `--params-path`:
- ensure the file exists and has a header row
- ensure it has at least one data row
- ensure placeholders like `{region}` match header names exactly
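For example, a `{region}` placeholder requires a params file with a matching `region` header (values here are made up):

```csv
region,quarter
EMEA,Q1
APAC,Q1
```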
Charts fail to render¶
Common causes:
- trace config references unknown columns
- transformed data is empty after filters
- runtime image backend issues (`kaleido`)
Helpful check:
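One generic check (plain Python, not a slideflow command, and assuming Plotly-based charts) is to confirm the rendering backends are importable in the active environment:

```python
import importlib.util

def check_render_backends(mods=("plotly", "kaleido")):
    """Return whether each chart-rendering module can be imported."""
    return {mod: importlib.util.find_spec(mod) is not None for mod in mods}

if __name__ == "__main__":
    for mod, ok in check_render_backends().items():
        print(f"{mod}: {'ok' if ok else 'MISSING'}")
```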
dbt connector issues¶
Common causes:
- missing Databricks auth env vars
- missing BigQuery project/auth settings when using `warehouse.type: bigquery`
- invalid `package_url` or missing token env var used in the URL
- profile/target mismatch during dbt compile
Frequent CI symptom:
dbt compile failed: Path '/home/runner/.dbt' does not exist
Frequent dbt model-resolution symptom:
Ambiguous dbt model alias '...'
Fixes:
- set `dbt.profiles_dir` in your `dbt` source config (or `profiles_dir` in legacy `databricks_dbt`), or
- ensure `profiles.yml` exists at the dbt project root in the cloned repo.
For private dbt deps/repo access, ensure token env vars referenced by `package_url` or `env_var(...)` are present at runtime.
If alias ambiguity occurs, add one of these selectors in your source config:
- `model_unique_id` (most specific)
- `model_package_name`
- `model_selector_name`
For BigQuery dbt execution, ensure at least one project-id source is available: `warehouse.project_id`, `BIGQUERY_PROJECT`, or `GOOGLE_CLOUD_PROJECT`.
And provide BigQuery auth via one of:
- `warehouse.credentials_path`,
- `warehouse.credentials_json`, or
- Application Default Credentials (`GOOGLE_APPLICATION_CREDENTIALS`, workload identity, etc.).
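Pulling the items above together, a dbt source config could be sketched as follows; every key appears somewhere in this section, but the exact nesting is an assumption against your schema:

```yaml
# Sketch only -- verify key placement against the slideflow schema docs.
source:
  type: dbt
  profiles_dir: ./profiles                  # or rely on profiles.yml at the project root
  package_url: "https://{{ env_var('GIT_TOKEN') }}@github.com/org/analytics.git"
  model_unique_id: model.analytics.revenue  # most specific alias disambiguator
  warehouse:
    type: bigquery
    project_id: my-gcp-project              # or BIGQUERY_PROJECT / GOOGLE_CLOUD_PROJECT
    credentials_path: /path/to/service-account.json
```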
NumPy binary-compatibility warnings¶
If you see warnings like:
`numpy.integer size changed, may indicate binary incompatibility`
`numpy.floating size changed, may indicate binary incompatibility`
these indicate a local wheel ABI mismatch. Rebuild the environment from scratch so NumPy/Pandas are installed as a compatible pair:
rm -rf .venv
uv sync --extra docs --extra dev --extra ai --locked
source .venv/bin/activate
uv run python scripts/ci/check_numpy_binary_compatibility.py
Notes:
- CI now runs the same ABI check script to prevent regressions.
- If your org mirrors wheels, ensure both NumPy and Pandas are resolved from the same mirror snapshot.
AI replacement issues¶
Common causes:
- missing provider credentials/API keys
- invalid provider name/model combination
- upstream rate-limit or provider outage
CI failures¶
- version mismatch: align `pyproject.toml` and `slideflow/__init__.py`
- release branch mismatch: branch must match `release/vX.Y.Z`
- docs strict build failure: fix broken links or invalid markdown references