SEER SDK & Libraries

Integrate SEER monitoring into your scripts and applications with our official SDKs.

Available SDKs

Python SDK (seerpy)

The official Python library for integrating SEER monitoring into scripts and data pipelines.

pip install seerpy

View Python SDK Documentation →

REST API

Use our REST API from any programming language or platform.

https://api.ansrstudio.com

View REST API Documentation →
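
You can also call the API directly from Python. The sketch below is a rough illustration, not a documented client: the /monitoring endpoint and headers mirror the curl example in the Quick Start below, while the JSON payload fields are assumed placeholders rather than a documented schema.

import requests

# Endpoint and headers taken from the curl example in the Quick Start;
# the payload fields are hypothetical placeholders, not a documented schema.
response = requests.post(
    "https://api.ansrstudio.com/monitoring",
    headers={
        "Authorization": "your_api_key",
        "Content-Type": "application/json",
    },
    json={"job_name": "my-pipeline", "status": "started"},  # assumed fields
    timeout=30,
)
response.raise_for_status()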

Quick Start Guide

1. Installation

# Install the Python SDK
pip install seerpy

# Or use the REST API directly
curl -X POST https://api.ansrstudio.com/monitoring \
  -H "Authorization: your_api_key" \
  -H "Content-Type: application/json"

2. Basic Usage

from seerpy import Seer

# Initialize the SDK
seer = Seer(api_key="your_api_key")

# Start monitoring
seer.start(job_name="my-pipeline")

try:
    # Your code here
    process_data()
    seer.success()
except Exception as e:
    seer.failure(str(e))
    raise

3. Advanced Features

# Add metadata and logging
seer.start(
    job_name="data-pipeline",
    metadata={
        "environment": "production",
        "version": "1.2.0"
    }
)

# Log progress
seer.log("Processing started")
seer.log("Checkpoint 1 complete", {"records": 1000})

# Send heartbeats for long-running jobs
seer.heartbeat()

# Complete with custom metadata
seer.success(metadata={"records_processed": 5000})

SDK Features

Core Features

  • Automatic Error Handling: Captures exceptions and stack traces
  • Progress Logging: Log checkpoints and intermediate results
  • Metadata Support: Attach custom data to runs
  • Heartbeat Monitoring: Track long-running processes
  • Offline Mode: Queue API calls when network is unavailable
  • Retry Logic: Automatic retry for failed API calls

Configuration Options

from seerpy import Seer

seer = Seer(
    api_key="your_api_key",
    base_url="https://api.ansrstudio.com",
    timeout=30,  # API timeout in seconds
    max_retries=3,  # Retry failed requests
    enable_offline=True  # Queue calls when offline
)

Common Patterns

Context Manager

Use Python's context manager protocol for automatic success and failure reporting:

with seer.monitor(job_name="data-pipeline"):
    process_data()
    # Automatically calls seer.success() on exit
    # Calls seer.failure() on exception

Decorator Pattern

Monitor functions with a simple decorator:

@seer.monitor_function(job_name="etl-process")
def extract_transform_load():
    # Your ETL logic here
    return results
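
Calling the decorated function then works like any normal call. Assuming the decorator mirrors the context manager above (an assumption, not stated in this guide), each invocation is recorded as a run of "etl-process" and marked successful or failed depending on whether the function returns or raises.

# Assumed behavior: a normal return marks the run successful,
# an uncaught exception marks it failed.
results = extract_transform_load()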

Batch Operations

Monitor batch processing with progress updates:

seer.start(job_name="batch-processor")

for i, batch in enumerate(batches):
    process_batch(batch)
    seer.log(f"Processed batch {i+1}/{len(batches)}")
    seer.heartbeat()

seer.success(metadata={"total_batches": len(batches)})