Development Workflow
Local Development
- Create a feature branch
- Develop models locally
- Query your tables on Dune, remembering to use the `dune.` catalog prefix
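In practice, the local loop above might look like this; the branch and model names are hypothetical, and the exact dbt targets depend on your project setup:

```shell
# Create a feature branch (name is an example)
git checkout -b feature/my-new-model

# Build and test just the model you are working on (model name is an example)
dbt run --select my_new_model
dbt test --select my_new_model
```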
Pull Request Workflow
- Push changes and open a PR
- Automated CI runs:
  - CI enforces that the branch is up to date with main
  - Runs modified models with `--full-refresh` in an isolated schema, `{team}__tmp_pr{number}`
  - Runs tests on modified models
  - Tests incremental run logic
- Team review:
  - Review transformation logic in GitHub
  - Check CI results
  - Approve and merge when ready
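Conceptually, the CI job's dbt invocations boil down to state-based selection against the last production manifest. This is a sketch, not the template's exact commands; the manifest path and any schema-override flags are assumptions:

```shell
# Select only models changed relative to the previous production manifest
dbt run  --select state:modified --full-refresh --state ./prod-artifacts
dbt test --select state:modified --state ./prod-artifacts
```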
Production Deployment
The production workflow includes an hourly schedule (`0 * * * *`), but it is commented out by default. You can enable it by uncommenting the corresponding lines when you are ready to run production jobs automatically.
- State comparison: uses the manifest from the previous run to detect changes
- Full refresh of modified models: any changed models run with `--full-refresh`
- Incremental run: all models run with normal incremental logic
- Testing: all models are tested
- Notification: email sent on failure
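The production sequence above can be sketched as three dbt invocations; the manifest path is an assumption, and the template's actual flags may differ:

```shell
# 1. Full refresh of models changed since the last run
dbt run --select state:modified --full-refresh --state ./last-run-artifacts

# 2. Normal incremental run of all models
dbt run

# 3. Test everything
dbt test
```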
CI/CD with GitHub Actions
The template includes two GitHub Actions workflows:
CI Workflow (.github/workflows/ci.yml)
Runs on every pull request. It requires the following to be configured in the repository:
- `DUNE_API_KEY`
- `DUNE_TEAM_NAME`
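In the workflow file, these are typically passed to dbt as environment variables. Whether each one is stored as a secret or a repository variable is an assumption here; adjust to match your repository settings:

```yaml
env:
  DUNE_API_KEY: ${{ secrets.DUNE_API_KEY }}
  DUNE_TEAM_NAME: ${{ vars.DUNE_TEAM_NAME }}
```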
Production Workflow (.github/workflows/prod.yml)
Runs hourly on the main branch.
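Enabling the hourly schedule amounts to uncommenting a trigger along these lines in prod.yml; the exact surrounding YAML in the template may differ:

```yaml
on:
  schedule:
    # Hourly schedule; commented out by default in the template
    - cron: "0 * * * *"
```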
Troubleshooting
Connection Issues
Problem: `dbt debug` fails with a connection error.
Solution:
- Verify `DUNE_API_KEY` and `DUNE_TEAM_NAME` are set correctly
- Check that Data Transformations is enabled for your team
- Ensure `transformations: true` is in the session properties
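Assuming the connection uses the dbt Trino adapter, the session property lives in profiles.yml, roughly as below; the profile name, target name, and connection details are placeholders, not the template's actual values:

```yaml
dune:
  target: prod
  outputs:
    prod:
      type: trino
      session_properties:
        transformations: true
```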
Models Not Appearing in Dune
Problem: Can’t find tables in the Data Explorer or in queries.
Solution:
- Check the Connectors section in the Data Explorer under “My Data”
- Remember to use the `dune.` catalog prefix in queries
- Verify the table was created in the correct schema
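For example, a query against a transformed table references it through the catalog prefix. The team, schema, and table names below are placeholders, and the exact namespace layout may differ for your setup:

```sql
SELECT *
FROM dune.my_team.my_table
LIMIT 10;
```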
Incremental Models Not Working
Problem: Incremental models always do a full refresh.
Solution:
- Check that the `is_incremental()` macro is used correctly
- Verify the `unique_key` configuration matches your table structure
- Ensure the target table exists before running incrementally
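A minimal incremental model sketch using both pieces of configuration mentioned above; the source, column, and key names are hypothetical:

```sql
{{ config(materialized='incremental', unique_key='tx_hash') }}

select tx_hash, block_time, value
from {{ source('my_source', 'transactions') }}
{% if is_incremental() %}
  -- On incremental runs, only process rows newer than what is already in the target
  where block_time > (select max(block_time) from {{ this }})
{% endif %}
```

On the first run (or with `--full-refresh`) the `is_incremental()` branch is skipped and the whole table is built; on subsequent runs only new rows are merged on `unique_key`.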
CI/CD Failures
Problem: GitHub Actions runs are failing.
Solution:
- Verify secrets and variables are set correctly in GitHub
- Check that the branch is up to date with main
- Review the workflow logs for specific errors
Limitations
Metadata Discovery
Limited support for some metadata discovery queries such as `SHOW TABLES` or `SHOW SCHEMAS` in certain contexts. This may affect autocomplete in some BI tools.
Workaround: Use the Data Explorer or query `information_schema` directly.
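For example, listing tables via `information_schema` instead of `SHOW TABLES`; the schema name is a placeholder:

```sql
SELECT table_name
FROM information_schema.tables
WHERE table_schema = 'my_schema';
```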
Result Set Size
Large result sets may time out. Consider:
- Paginating with `LIMIT` and `OFFSET`
- Narrowing filters to reduce data volume
- Breaking complex queries into smaller parts
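A paginated pull might look like the sketch below; the table and column names are placeholders. Note that `OFFSET` pagination is only stable when paired with a deterministic `ORDER BY`:

```sql
SELECT tx_hash, block_time
FROM dune.my_team.my_table
ORDER BY block_time, tx_hash
LIMIT 1000 OFFSET 0;  -- next page: OFFSET 1000
```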
Read-After-Write Consistency
Tables and views are available for querying immediately after creation, but catalog caching may cause brief delays (typically under 60 seconds) before they appear in some listing operations.
Rate Limits
Rate limits for Data Transformations align with the Dune Analytics API:
- Requests are subject to the same rate limiting as API executions
- Large query operations run on the Large Query Engine tier
- See Rate Limits for detailed information