Prerequisites
Before installing Spark, make sure you have:
Python (v3.10 or higher)
- macOS: `brew install python@3.11`
- Ubuntu/Debian: `sudo apt install python3.11 python3-pip`
- Windows: download the installer from python.org
Git (for repository tasks)
System Permissions
Spark runs scheduled tasks, so you may need:
- Linux/macOS: Access to cron or systemd
- Windows: Task Scheduler access
- Docker: For containerized deployments
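You can probe for each of these backends before installing. A minimal sketch (presence checks only; it does not verify that your user may actually schedule jobs):

```shell
# Print a line for each scheduling backend found on this machine;
# absent tools simply print nothing.
command -v crontab >/dev/null && echo "cron: available"
command -v systemctl >/dev/null && echo "systemd: present"
command -v docker >/dev/null && echo "docker: present"
```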
Installation Methods
- pip (Recommended)
- pipx (Isolated)
- uvx (Instant)
- Docker (Recommended for Production)
Install Spark via pip, then verify the installation.
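A sketch of the install and verification steps. The package name `spark-scheduler` and the `spark` CLI entry point are assumptions, since the source elides the actual commands:

```shell
# Standard install from PyPI (package name assumed)
pip install spark-scheduler

# Alternatives: isolated install via pipx, or run directly with uvx
pipx install spark-scheduler
uvx spark-scheduler --version

# Verify the CLI is on PATH and prints its version
spark --version
```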
Configuration
After installation, initialize Spark; initialization generates a default config.json.
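A sketch of what the generated config.json might contain. Every key here is an assumption, since the source omits the file; the init subcommand name is likewise assumed:

```json
{
  "timezone": "UTC",
  "workflows_dir": "workflows",
  "log_level": "info",
  "notifications": {
    "email": { "enabled": false }
  }
}
```

Adjust `timezone` and the notification settings to match your environment before starting the scheduler.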
Create Your First Workflow
Create a workflow file in `workflows/daily-report.yaml`.
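A sketch of what such a workflow file could look like. All field names (`name`, `schedule`, `tasks`, `run`, `notify`) are assumptions, since the source elides the example:

```yaml
# workflows/daily-report.yaml -- field names are assumptions
name: daily-report
schedule: "0 9 * * *"        # cron syntax: every day at 09:00
tasks:
  - name: generate-report
    run: python scripts/daily_report.py
notify:
  on_failure: true
```

The `schedule` field uses standard five-field cron syntax (minute, hour, day of month, month, day of week).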
Start Spark
Start the Spark scheduler.

Verify Installation
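Since the source elides the commands, here is a sketch assuming the CLI exposes `start` and `status` subcommands:

```shell
# Start the scheduler (subcommand assumed)
spark start

# Confirm it is running (subcommand assumed)
spark status
```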
Test that everything is working.

Troubleshooting
Scheduler not starting
Check if another instance is running:
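A quick way to look for a competing instance; the service name `spark` in the systemd check is an assumption:

```shell
# List any running processes whose command line mentions spark
pgrep -af spark

# If Spark was set up as a systemd service, check the unit instead
systemctl status spark
```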
Timezone issues
Make sure timezone is correctly configured:
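On Linux with systemd you can confirm the system timezone as follows; whether Spark also reads a `timezone` key from config.json is an assumption:

```shell
# Show the system timezone
timedatectl | grep "Time zone"

# Spark's own timezone would be set in config.json (key name assumed), e.g.:
#   "timezone": "America/New_York"
```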
Workflow execution failures
Check logs for details:
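A sketch of where to look; the log location `~/.spark/logs/scheduler.log` is an assumption, since the source does not state it:

```shell
# Show the most recent scheduler log entries (path assumed)
tail -n 50 ~/.spark/logs/scheduler.log
```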
Notifications not sending
Verify notification configuration:
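Two hedged checks; the config file location and the existence of a test-notification subcommand are both assumptions:

```shell
# Inspect the notifications section of the config
cat config.json

# Send a test notification, assuming the CLI offers such a subcommand
spark notify --test
```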
System Service Setup (Optional)
For production deployments, run Spark as a system service:

- systemd (Linux)
- Docker Compose
Create /etc/systemd/system/spark.service, then enable and start the service.
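A sketch of the unit file. The install path, the `ExecStart` command, and the dedicated `spark` user are all assumptions:

```ini
# /etc/systemd/system/spark.service -- paths, command, and user are assumptions
[Unit]
Description=Spark workflow scheduler
After=network.target

[Service]
Type=simple
User=spark
ExecStart=/usr/local/bin/spark start
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Then enable and start it in one step with `sudo systemctl enable --now spark`.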
