Powered by Dagster
5X Jobs is built on Dagster under the hood, providing enterprise-grade orchestration capabilities while keeping the complexity hidden from users.

Key capabilities
Visual workflow builder
Create complex data pipelines using an intuitive drag-and-drop interface:
- Trigger nodes - Start workflows with manual, scheduled, webhook, or pull request triggers
- Task nodes - Execute data ingestion, dbt modeling, custom Python code, or parallel jobs
- Dependency management - Define execution order and parallel processing with visual connections
- Real-time preview - See your pipeline structure and flow before execution
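Under the hood, the visual connections between nodes form a directed acyclic graph (DAG). As a rough sketch of how dependency-ordered execution can be resolved (the task names and structure below are illustrative, not part of the 5X API):

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical pipeline: each task maps to the set of tasks it depends on.
pipeline = {
    "ingest_orders": set(),            # trigger-fed ingestion, no upstream deps
    "ingest_customers": set(),
    "dbt_model": {"ingest_orders", "ingest_customers"},  # waits for both loads
    "publish_report": {"dbt_model"},
}

# static_order() yields tasks so every dependency runs before its dependents;
# tasks with no mutual dependency (the two ingests) may run in parallel.
execution_order = list(TopologicalSorter(pipeline).static_order())
print(execution_order)
```

Independent branches (here, the two ingestion tasks) are exactly the places where a visual builder can fan work out in parallel.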
Flexible scheduling options
Choose how your jobs are triggered based on your workflow needs:
- Manual triggers
- Scheduled triggers
- Webhook triggers
- Pull request triggers
On-demand execution
Run jobs instantly with a single click for immediate processing. Use cases:
- Testing and debugging workflows
- Ad-hoc data processing
- Emergency data updates
- Development and experimentation
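Conceptually, all four trigger types are different entry points that converge on the same run. A minimal sketch of that dispatch (the function and job names are made up for illustration):

```python
from typing import Callable

def start_run(job_name: str, reason: str) -> str:
    """Stand-in for kicking off a job run; returns a run description."""
    return f"{job_name} started ({reason})"

# Each trigger type supplies its own context, then hands off to the same
# start_run entry point.
TRIGGERS: dict[str, Callable[[str], str]] = {
    "manual":       lambda job: start_run(job, "manual click"),
    "scheduled":    lambda job: start_run(job, "schedule fired"),
    "webhook":      lambda job: start_run(job, "webhook received"),
    "pull_request": lambda job: start_run(job, "pull request opened"),
}

print(TRIGGERS["manual"]("nightly_sales"))
```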
Comprehensive task automation
Execute different types of work within your pipelines:

Data ingestion:
- Use the same 600+ connectors available in 5X Ingestion
- Seamlessly integrate data collection into your workflows
- Automatic dependency management with other pipeline steps
dbt modeling:
- Execute dbt runs with configurable commands and parameters
- Support for multiple repositories and branches
- Integration with dbt documentation generation
- Deferral support for comparing changes across environments
Custom Python code:
- Run Python scripts from GitHub repositories
- Support for Python versions 3.8.19 through 3.12.4
- Dependency management with requirements files
- Environment variable and secret injection
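Environment variable and secret injection means your script reads its configuration from the process environment rather than hard-coding it. A minimal sketch (the variable names are illustrative, not a 5X convention):

```python
import os

def load_config(env: dict[str, str]) -> dict[str, str]:
    """Pull settings injected by the platform, failing fast on missing secrets."""
    missing = [k for k in ("WAREHOUSE_URL", "API_TOKEN") if k not in env]
    if missing:
        raise RuntimeError(f"missing injected variables: {missing}")
    return {
        "warehouse_url": env["WAREHOUSE_URL"],
        "api_token": env["API_TOKEN"],
        "batch_size": env.get("BATCH_SIZE", "500"),  # optional, with a default
    }

# In a real run, pass os.environ; these values stand in for injected secrets.
config = load_config({"WAREHOUSE_URL": "snowflake://...", "API_TOKEN": "s3cr3t"})
print(config["batch_size"])
```

Failing fast when an expected secret is absent keeps misconfigured runs from silently producing partial results.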
Parallel jobs:
- Execute other jobs as sub-workflows
- Enable complex pipeline orchestration and reuse
- Hierarchical job management across your workspace
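Running other jobs as parallel sub-workflows is essentially a fan-out/fan-in pattern. A sketch using a thread pool (the job names and runner function are stand-ins, not the 5X API):

```python
from concurrent.futures import ThreadPoolExecutor

def run_sub_job(name: str) -> str:
    """Stand-in for triggering a child job and waiting for its result."""
    return f"{name}: success"

sub_jobs = ["load_eu_region", "load_us_region", "load_apac_region"]

# Fan out: launch the sub-jobs concurrently; fan in: collect results in order.
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(run_sub_job, sub_jobs))

print(results)
```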
Real-time monitoring and management
Track job execution with comprehensive visibility:
- Job run history - View all past executions with status, duration, and resource usage
- Real-time status - Monitor active jobs with live updates
- Detailed logs - Access execution logs for debugging and troubleshooting
- Performance metrics - Track compute usage, execution time, and resource consumption
- Draft management - Develop and test changes before publishing to production
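A run history boils down to one structured record per execution, from which metrics like duration are derived. A sketch of what such a record might carry (the fields are illustrative):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class JobRun:
    job: str
    status: str            # e.g. "success" or "failed"
    started_at: datetime
    finished_at: datetime

    @property
    def duration(self) -> timedelta:
        """Execution time, derived rather than stored."""
        return self.finished_at - self.started_at

run = JobRun("nightly_sales", "success",
             datetime(2024, 1, 1, 2, 0), datetime(2024, 1, 1, 2, 12))
print(run.duration)
```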

Intelligent alerting system
Stay informed about your job execution with comprehensive notifications:
- Success/failure alerts - Get notified when jobs complete or fail
- Multiple channels - Email and Slack integration
- Granular control - Set alerts at job level or workspace level
- Real-time delivery - Immediate notifications for critical events
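Granular alerting amounts to a filter over run outcomes plus a fan-out to channels. A sketch of that routing logic (channel names and rules are illustrative examples):

```python
def route_alert(job, status, channels, notify_on=("success", "failed")):
    """Return the messages that would be delivered for a finished run."""
    if status not in notify_on:
        return []  # this outcome is not subscribed to at this level
    return [f"[{channel}] {job} finished with status: {status}"
            for channel in channels]

# Job-level rule: only failures, delivered to both email and Slack.
messages = route_alert("nightly_sales", "failed",
                       channels=["email", "slack"], notify_on=("failed",))
print(messages)
```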
Environment and resource management
Run jobs across different environments with proper isolation:
- Environment configuration - Define separate environments for development, staging, and production
- App connection integration - Link environments to specific warehouse connections
- Environment variables - Manage secrets and configuration across environments
- Compute profiles - Configure resource allocation per job execution
- Concurrent execution control - Limit parallel job runs to manage resource usage
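Concurrent execution control is a cap on simultaneous runs: new runs wait until a slot frees up. A sketch of the idea using a semaphore (the limit of 2 is an arbitrary example):

```python
import threading
import time

MAX_CONCURRENT_RUNS = 2
slots = threading.BoundedSemaphore(MAX_CONCURRENT_RUNS)
peak = 0       # highest number of jobs observed running at once
active = 0
lock = threading.Lock()

def run_job(name: str) -> None:
    global peak, active
    with slots:                 # blocks while MAX_CONCURRENT_RUNS are active
        with lock:
            active += 1
            peak = max(peak, active)
        time.sleep(0.05)        # stand-in for the actual job work
        with lock:
            active -= 1

threads = [threading.Thread(target=run_job, args=(f"job_{i}",)) for i in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(peak)  # never exceeds the configured limit
```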
Getting started
Ready to automate your first data workflow? Here’s how to get started:

Prerequisites
Set up app connections
Configure your data warehouse connection and environment before creating jobs.
Create your first job
Build a workflow
Learn how to create triggers, add tasks, and build your first automated pipeline.