# Jira Webhook LLM

## Langfuse Integration

### Overview
The application integrates with Langfuse for observability and analytics of LLM usage and webhook events. This integration provides detailed tracking of:
- Webhook events
- LLM model usage
- Error tracking
- Performance metrics
### Configuration

Langfuse configuration is managed through both `application.yml` and environment variables.

#### application.yml

```yaml
langfuse:
  enabled: true
  public_key: "pk-lf-..."
  secret_key: "sk-lf-..."
  host: "https://cloud.langfuse.com"
```
#### Environment Variables

```bash
LANGFUSE_ENABLED=true
LANGFUSE_PUBLIC_KEY="pk-lf-..."
LANGFUSE_SECRET_KEY="sk-lf-..."
LANGFUSE_HOST="https://cloud.langfuse.com"
```
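For reference, here is a minimal sketch of how a client could be built from these variables, assuming the Langfuse Python SDK's `Langfuse(public_key=..., secret_key=..., host=...)` constructor; the exact wiring inside this application may differ.

```python
import os

from langfuse import Langfuse  # assumes the langfuse package from requirements.txt

# Only construct the client when the integration is enabled (see Enable/Disable below).
langfuse_client = None
if os.getenv("LANGFUSE_ENABLED", "false").lower() == "true":
    langfuse_client = Langfuse(
        public_key=os.environ["LANGFUSE_PUBLIC_KEY"],
        secret_key=os.environ["LANGFUSE_SECRET_KEY"],
        host=os.getenv("LANGFUSE_HOST", "https://cloud.langfuse.com"),
    )
```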
### Enable/Disable

To disable Langfuse integration:

- Set `langfuse.enabled: false` in `application.yml`, or
- Set `LANGFUSE_ENABLED=false` in your environment
### Tracking Details

The following events are tracked:

- Webhook events
  - Input payload
  - Timestamps
  - Issue metadata
- LLM processing
  - Model used
  - Input/output
  - Processing time
- Errors
  - Webhook processing errors
  - LLM processing errors
  - Validation errors
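As an illustration of how a webhook event and its LLM call could map onto a Langfuse trace, here is a sketch using the SDK's `trace(...)` and `generation(...)` calls. The function and parameter names (`record_webhook_trace`, `payload`, `issue_key`, `answer`) are placeholders for this example, not identifiers from the codebase.

```python
from langfuse import Langfuse


def record_webhook_trace(
    langfuse_client: Langfuse,
    payload: dict,
    issue_key: str,
    model: str,
    answer: str,
) -> None:
    """Sketch: one trace per webhook event, with the LLM call as a nested generation."""
    trace = langfuse_client.trace(
        name="jira-webhook",
        input=payload,                      # raw webhook payload
        metadata={"issue_key": issue_key},  # issue metadata
    )
    trace.generation(
        name="llm-processing",
        model=model,        # model used
        input=payload,      # prompt/input side
        output=answer,      # LLM output
    )
    langfuse_client.flush()  # ensure events are delivered before the request finishes
```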
### Viewing Data
Visit your Langfuse dashboard to view the collected metrics and traces.
## Deployment Guide

### Redis Configuration

The application requires Redis for caching and queue management. Configure Redis in `application.yml`:

```yaml
redis:
  host: "localhost"
  port: 6379
  password: ""
  db: 0
```
Environment variables can also be used:
```bash
REDIS_HOST="localhost"
REDIS_PORT=6379
REDIS_PASSWORD=""
REDIS_DB=0
```
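A minimal sketch of building a client from these settings with the `redis-py` package (assuming that is the Redis driver in use; the variable names are illustrative):

```python
import os

import redis  # redis-py client

redis_client = redis.Redis(
    host=os.getenv("REDIS_HOST", "localhost"),
    port=int(os.getenv("REDIS_PORT", "6379")),
    password=os.getenv("REDIS_PASSWORD") or None,  # empty string means no auth
    db=int(os.getenv("REDIS_DB", "0")),
)

redis_client.ping()  # raises an error if the server is unreachable -- quick connectivity check
```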
### Worker Process Management

The application uses Celery for background task processing. Configure workers in `application.yml`:

```yaml
celery:
  workers: 4
  concurrency: 2
  max_tasks_per_child: 100
```
Start workers with:

```bash
celery -A jira-webhook-llm worker --loglevel=info
```
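For context, here is a sketch of how the `concurrency` and `max_tasks_per_child` settings could be applied through Celery's standard configuration keys (`worker_concurrency`, `worker_max_tasks_per_child`). The module name and broker URL below are assumptions for illustration, not taken from this repository; the `workers` count is typically handled by running multiple worker processes under a process manager rather than by a single Celery setting.

```python
from celery import Celery

# Assumed broker URL built from the Redis settings above.
app = Celery("jira_webhook_llm", broker="redis://localhost:6379/0")

app.conf.update(
    worker_concurrency=2,            # maps to celery.concurrency in application.yml
    worker_max_tasks_per_child=100,  # maps to celery.max_tasks_per_child
)
```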
### Monitoring Setup

The application exposes a Prometheus metrics endpoint at `/metrics`. To set up monitoring:

- Add a Prometheus scrape config:

  ```yaml
  scrape_configs:
    - job_name: 'jira-webhook-llm'
      static_configs:
        - targets: ['localhost:8000']
  ```

- Set up a Grafana dashboard using the provided template
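To sanity-check that the scrape target is reachable before wiring up Prometheus, a quick stdlib-only check could look like this (the host and port are the assumed defaults from the scrape config above):

```python
from urllib.request import urlopen

# Fetch the metrics endpoint and print the first few exposition lines.
with urlopen("http://localhost:8000/metrics", timeout=5) as resp:
    body = resp.read().decode("utf-8")

print("\n".join(body.splitlines()[:5]))
```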
### Rate Limiting

Rate limiting is configured in `application.yml`:

```yaml
rate_limiting:
  enabled: true
  requests_per_minute: 60
  burst_limit: 100
```
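To illustrate how `requests_per_minute` (sustained rate) and `burst_limit` (short-term ceiling) typically interact, here is a small token-bucket sketch. It demonstrates the semantics of the two settings only; it is not the application's actual limiter.

```python
import time


class TokenBucket:
    """Allows bursts up to `burst_limit` while refilling at `requests_per_minute`."""

    def __init__(self, requests_per_minute: int = 60, burst_limit: int = 100):
        self.rate = requests_per_minute / 60.0   # tokens added per second
        self.capacity = burst_limit
        self.tokens = float(burst_limit)
        self.updated = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.updated) * self.rate)
        self.updated = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```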
### Health Check Endpoint

The application provides a health check endpoint at `/health` that returns:

```json
{
  "status": "OK",
  "timestamp": "2025-07-14T01:59:42Z",
  "components": {
    "database": "OK",
    "redis": "OK",
    "celery": "OK"
  }
}
```
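For readiness probes or external monitors, the endpoint can be polled and each component checked against the shape shown above. A stdlib-only sketch (the port is an assumed default):

```python
import json
from urllib.request import urlopen

with urlopen("http://localhost:8000/health", timeout=5) as resp:
    health = json.load(resp)

# Fail loudly if the service or any component is not reporting OK.
unhealthy = {name: state for name, state in health["components"].items() if state != "OK"}
if health["status"] != "OK" or unhealthy:
    raise RuntimeError(f"Service degraded: {unhealthy or health['status']}")
```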
### System Requirements
Minimum system requirements:
- Python 3.9+
- Redis 6.0+
- 2 CPU cores
- 4GB RAM
- 10GB disk space
Required Python packages are listed in `requirements.txt`.