Create Your First Automated Workflow in 5 Minutes

This guide will walk you through setting up Spark and creating a workflow that runs automatically on a schedule.

Step 1: Initialize Spark

# Initialize Spark workspace
spark init

# This creates:
# - .spark/ directory
# - workflows/ directory
# - config.json

Step 2: Configure Spark

Edit config.json:
{
  "scheduler": {
    "timezone": "America/New_York",
    "maxConcurrent": 5
  },
  "monitoring": {
    "enabled": true,
    "logLevel": "info"
  },
  "notifications": {
    "enabled": true,
    "onFailure": true
  }
}

Step 3: Create Your First Workflow

Create workflows/daily-standup.yaml:
name: daily-standup
description: Daily standup report generation
schedule: "0 9 * * 1-5"  # Monday-Friday at 9 AM
timezone: America/New_York

tasks:
  - name: get-commits
    type: git
    action: log
    args:
      since: "24 hours ago"
      format: "- %s by %an"

  - name: get-prs
    type: github
    action: list-prs
    args:
      state: open
      repo: "owner/repo"

  - name: format-report
    type: script
    script: |
      echo "# Daily Standup - $(date +%Y-%m-%d)"
      echo ""
      echo "## Yesterday's Commits"
      echo "$TASK_get_commits_OUTPUT"
      echo ""
      echo "## Open PRs"
      echo "$TASK_get_prs_OUTPUT"

  - name: send-to-slack
    type: slack
    channel: "#daily-standup"
    message: "{{ tasks.format-report.output }}"

notifications:
  onSuccess: true
  onFailure: true

Tip: use crontab.guru to build cron expressions easily!
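
For reference, the five cron fields read left to right as minute, hour, day of month, month, and day of week. The schedule above breaks down as:

0 9 * * 1-5
│ │ │ │ └── day of week (1-5 = Monday through Friday)
│ │ │ └──── month (* = every month)
│ │ └────── day of month (* = every day)
│ └──────── hour (9)
└────────── minute (0)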

Step 4: Test Your Workflow

Before scheduling, test it:
# Validate (checks syntax without executing)
spark workflow validate daily-standup

# Test run (executes once)
spark workflow run daily-standup --test

# Check output
spark workflow logs daily-standup --last

Step 5: Start the Scheduler

# Start Spark
spark start

# You should see:
# ⚡ Spark v1.0.0
# ✓ Workflows loaded: 1
# ✓ Scheduler started
# ✓ Next execution: daily-standup at 2025-10-31 09:00:00

Real-World Example: Automated Dependency Updates

Create a workflow that checks for dependency updates weekly:
name: dependency-updates
description: Weekly dependency update check
schedule: "0 10 * * 1"  # Every Monday at 10 AM

tasks:
  - name: check-npm-updates
    type: script
    script: |
      # npm outdated exits non-zero when updates exist; don't fail the task
      npm outdated --json > outdated.json || true

  - name: check-python-updates
    type: script
    script: |
      pip list --outdated --format=json > pip-outdated.json

  - name: create-pr-if-updates
    type: script
    script: |
      # "-s" (file is non-empty) is unreliable here: pip prints "[]" and
      # npm may print "{}" even when nothing is outdated, so grep for
      # fields that only appear in real entries
      if grep -q '"latest"' outdated.json || grep -q '"latest_version"' pip-outdated.json; then
        BRANCH="deps/weekly-updates-$(date +%Y%m%d)"

        # Create branch
        git checkout -b "$BRANCH"

        # Update dependencies
        npm update
        pip install --upgrade -r requirements.txt

        # Commit and push
        git add package*.json requirements.txt
        git commit -m "chore: weekly dependency updates"
        git push origin "$BRANCH"

        # Create PR
        gh pr create \
          --title "Weekly Dependency Updates" \
          --body "Automated dependency updates from Spark"
      fi

retryOn:
  - network-error

notifications:
  onSuccess: true
  channels: ["slack", "email"]

Common Workflow Patterns

  • Daily Reports
  • Continuous Monitoring
  • Backup Automation

For example, a daily report workflow:
name: daily-metrics
schedule: "0 8 * * *"  # Every day at 8 AM

tasks:
  - name: query-database
    type: sql
    query: "SELECT COUNT(*) as users FROM users WHERE created_at >= NOW() - INTERVAL 1 DAY"

  - name: generate-chart
    type: python
    script: |
      import matplotlib.pyplot as plt
      # Generate chart from query results (see the fleshed-out sketch below)

  - name: send-report
    type: email
    to: "team@example.com"
    subject: "Daily Metrics Report"
    attachments:
      - "metrics.png"

Workflow Features

Conditional Execution

tasks:
  - name: check-condition
    type: script
    script: |
      # date +%u prints the day of week, 1 (Mon) through 7 (Sun);
      # the test exits 0 (true) on weekdays, 1 (false) on weekends
      [ "$(date +%u)" -lt 6 ]

  - name: weekday-only-task
    type: script
    runIf: "{{ tasks.check-condition.success }}"
    script: |
      echo "This runs on weekdays only"

Parallel Tasks

tasks:
  - name: task-group-1
    parallel:
      - name: build-frontend
        type: script
        script: npm run build

      - name: build-backend
        type: script
        script: go build

      - name: run-tests
        type: script
        script: npm test
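
Parallel groups can be mixed with ordinary sequential tasks. Assuming Spark waits for every task in a group before starting the next top-level task, a combined sketch looks like this:

tasks:
  - name: build-all
    parallel:
      - name: build-frontend
        type: script
        script: npm run build

      - name: build-backend
        type: script
        script: go build

  - name: package-release
    type: script
    script: ./package.sh  # hypothetical; runs after both builds finish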

Error Handling

tasks:
  - name: risky-task
    type: script
    script: ./might-fail.sh
    retryAttempts: 3
    retryDelay: 60  # seconds
    continueOnError: false

onError:
  - name: cleanup
    type: script
    script: ./cleanup.sh

  - name: notify
    type: slack
    channel: "#alerts"
    message: "Workflow failed: {{ error.message }}"

Monitoring & Debugging

# View all workflows
spark workflow list

# Check execution history
spark history --workflow daily-standup --limit 10

# View logs
spark logs --workflow daily-standup --tail 50

# Real-time monitoring
spark monitor

# Check next scheduled runs
spark schedule --upcoming 24h

Common Use Cases

Code Quality Checks

schedule: "0 */4 * * *"  # Every 4 hours
tasks:
  - run linters
  - run tests
  - check coverage
  - update status badge
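
These sketches use shorthand task names. Written out in the schema from Step 3, the code-quality case might look like the following (the individual commands are placeholders for your own tooling); the other use cases below expand the same way:

name: code-quality
schedule: "0 */4 * * *"  # every 4 hours

tasks:
  - name: run-linters
    type: script
    script: npm run lint  # placeholder for your linter

  - name: run-tests
    type: script
    script: npm test

  - name: check-coverage
    type: script
    script: npm run coverage  # placeholder for your coverage tool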

Data Pipelines

schedule: "0 1 * * *"  # Daily at 1 AM
tasks:
  - extract data from APIs
  - transform and clean
  - load to warehouse
  - update dashboards

Security Scans

schedule: "0 0 * * 0"  # Weekly on Sunday
tasks:
  - scan dependencies
  - check for CVEs
  - audit Docker images
  - create security report

Content Publishing

schedule: "0 10 * * 1,3,5"  # Mon/Wed/Fri
tasks:
  - generate content
  - create images
  - publish to blog
  - share on social media

Pro Tips

Keep secrets out of workflows:
tasks:
  - name: deploy
    type: script
    env:
      API_KEY: "${DEPLOY_API_KEY}"
      DB_PASSWORD: "${DB_PASS}"
    script: |
      ./deploy.sh

Set the values in .env:
DEPLOY_API_KEY=secret-key
DB_PASS=secret-pass
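
And keep that file out of version control:
# .gitignore
.env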

Always test workflows before letting them run automatically:
# Validate syntax
spark workflow validate my-workflow

# Dry run
spark workflow run my-workflow --dry-run

# Test run
spark workflow run my-workflow --test

# Only then, schedule it
spark start

Track how long your tasks take:
# View execution stats
spark stats --workflow daily-standup

# Optimize slow tasks
spark analyze --workflow daily-standup

Next Steps

Advanced Workflows

Learn:
  • Multi-stage workflows
  • Dynamic task generation
  • Workflow templates

Integrations

Connect to:
  • GitHub Actions
  • Jenkins
  • AWS Lambda
  • Custom webhooks

Notifications

Configure:
  • Slack alerts
  • Email reports
  • Discord notifications
  • Custom webhooks

Production Deployment

Deploy:
  • As systemd service
  • In Docker
  • On Kubernetes

Troubleshooting

If a workflow never fires, check the scheduler and the cron expression:
spark status
spark workflow validate my-workflow

# Verify cron expression at crontab.guru
# Test manually first
spark workflow run my-workflow --test

For more detail, enable debug logging:
# In config.json
{
  "monitoring": {
    "logLevel": "debug"
  }
}

# Restart Spark
spark restart --debug

If runs fire at the wrong time, verify the timezone settings:
# Check current timezone
spark config get scheduler.timezone

# Set explicitly in workflow
# In workflow YAML:
# timezone: "America/Los_Angeles"

Congratulations! 🎉 You’ve created your first automated workflow with Spark. Your AI agents now work 24/7!

Join the Community

Share your workflows and automation ideas with other Spark users.