Metadata-Version: 2.4
Name: firework-sandbox
Version: 0.1.0
Summary: Local sandbox client for Firecracker/Docker execution with E2B/Daytona ergonomics
Author: Firework Team
License: MIT
Project-URL: Homepage, https://github.com/firework-sandbox/firework
Project-URL: Documentation, https://github.com/firework-sandbox/firework#readme
Project-URL: Repository, https://github.com/firework-sandbox/firework
Project-URL: Issues, https://github.com/firework-sandbox/firework/issues
Keywords: sandbox,firecracker,docker,isolation,microvm,container,vm,virtualization
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: System :: Emulators
Classifier: Framework :: AsyncIO
Classifier: Operating System :: POSIX :: Linux
Classifier: Operating System :: MacOS
Classifier: Typing :: Typed
Requires-Python: >=3.10
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: aiohttp>=3.8.0
Provides-Extra: dev
Requires-Dist: hypothesis>=6.0.0; extra == "dev"
Requires-Dist: pytest>=7.0.0; extra == "dev"
Requires-Dist: pytest-asyncio>=0.21.0; extra == "dev"
Requires-Dist: build>=1.0.0; extra == "dev"
Requires-Dist: twine>=4.0.0; extra == "dev"
Dynamic: license-file

# Firework 🎆

A high-level, local-only client library for managing Firecracker/Docker sandboxes. No authentication, no remote APIs – just pure local execution with E2B/Daytona ergonomics.

[![PyPI version](https://badge.fury.io/py/firework-sandbox.svg)](https://badge.fury.io/py/firework-sandbox)
[![Python 3.10+](https://img.shields.io/badge/python-3.10+-blue.svg)](https://www.python.org/downloads/)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)

## Features

- **Zero-config by default**: Works out-of-the-box with sensible local defaults
- **Backend agnostic**: Abstracts Firecracker, Docker, or custom runtimes
- **Async-first**: All operations are non-blocking with streaming support
- **Resource-aware**: Automatic cleanup, lifecycle management, metrics
- **Pre-built environments**: Python ML, PyTorch, TensorFlow, Node.js, and more
- **No network required**: Everything runs on localhost

## Installation

```bash
pip install firework-sandbox
```

No API keys, no daemon required. Development dependencies are available as an extra: `pip install "firework-sandbox[dev]"`.

## Quick Start

```python
import asyncio
from firework import Sandbox

async def main():
    # Create a sandbox with auto-cleanup
    async with Sandbox.create(template="python-ml") as sandbox:
        # Install packages
        await sandbox.process.exec("pip install numpy pandas")
        
        # Execute Python code
        result = await sandbox.process.exec("python -c 'import numpy; print(numpy.__version__)'")
        print(result.stdout)  # e.g. "1.24.0\n"
        
        # Upload a file
        await sandbox.filesystem.write("/app/script.py", """
import pandas as pd
df = pd.DataFrame({'a': [1, 2, 3], 'b': [4, 5, 6]})
df.to_csv('/app/output.csv', index=False)
print(df.describe())
""")
        
        # Run the script
        result = await sandbox.process.exec("python /app/script.py")
        print(result.stdout)
        
        # Download the result
        await sandbox.filesystem.download("/app/output.csv", "./output.csv")

asyncio.run(main())
```

## Pre-built Environments

Firework comes with pre-configured environments for common use cases:

| Environment | Description | Packages |
|-------------|-------------|----------|
| `base` | Minimal Python 3.11 | pip, setuptools |
| `python-ml` | Machine Learning | numpy, pandas, scikit-learn, matplotlib |
| `python-torch` | Deep Learning (PyTorch) | torch, torchvision, numpy |
| `python-tensorflow` | Deep Learning (TensorFlow) | tensorflow, keras, numpy |
| `python-data` | Data Engineering | polars, duckdb, pyarrow, sqlalchemy |
| `python-web` | Web Development | fastapi, uvicorn, httpx, pydantic |
| `python-llm` | LLM/AI Applications | openai, anthropic, langchain, transformers |
| `nodejs` | Node.js 20 | Node.js runtime |
| `nodejs-full` | Node.js with tools | typescript, ts-node |

```python
# Use a specific environment
sandbox = await Sandbox.create(template="python-torch")

# List all available environments
from firework import list_environments
for env in list_environments():
    print(f"{env['name']}: {env['description']}")
```

## Sandbox Lifecycle

### Create

```python
from firework import Sandbox

# Minimal (recommended)
sandbox = await Sandbox.create(template="base")

# Full control
sandbox = await Sandbox.create(
    template="python-ml",
    name="data-analysis-123",
    environment={"DEBUG": "true", "PYTHONUNBUFFERED": "1"},
    vcpu=2,
    memory_mb=1024,
    timeout_seconds=3600,
)

print(f"ID: {sandbox.id}")           # sbx_abc123...
print(f"Root: {sandbox.root_path}")  # ~/.firework/sandboxes/sbx_abc123
```

### Reconnect to Existing Sandbox

```python
# Reconnect to a running sandbox
sandbox = await Sandbox.reconnect("sbx_abc123")
```

### State Management

```python
# Check state
state = await sandbox.get_state()  # "running", "paused", "stopped"

# Pause/Resume (saves resources)
await sandbox.pause()
await sandbox.resume()

# Destroy (cleanup)
await sandbox.destroy()
```

### Auto-cleanup with Context Manager

```python
async with Sandbox.create(template="base") as sandbox:
    result = await sandbox.process.exec("echo Hello")
    # Automatically destroyed on exit
```

## Process Execution

### Blocking Execution

```python
result = await sandbox.process.exec(
    command="python /app/analyze.py",
    cwd="/workspace",
    environment={"INPUT": "/data/file.csv"},
    timeout_seconds=30
)

print(result.stdout)
print(result.stderr)
print(f"Exit code: {result.exit_code}")
print(f"Runtime: {result.runtime_seconds}s")
```

### Streaming Execution

```python
stream = await sandbox.process.exec_stream(
    command="python long_running.py",
    cwd="/app"
)

async for event in stream:
    match event.type:
        case "stdout":
            print(f"OUT: {event.content}", end="")
        case "stderr":
            print(f"ERR: {event.content}", end="")
        case "exit":
            print(f"DONE: exit code {event.exit_code}")
```

### Background Processes

```python
import asyncio

# Start a server in the background
server = await sandbox.process.start(
    command="python -m http.server 8000",
    cwd="/public",
    background=True
)

# Check if running
running = await server.is_running()

# Wait for completion or kill
try:
    exit_code = await server.wait(timeout_seconds=60)
except asyncio.TimeoutError:
    await server.kill()
```

### Batch Execution

```python
# Run multiple commands sequentially
results = await sandbox.process.batch_exec([
    "pip install -r requirements.txt",
    "python setup.py",
    "python train.py",
    "python evaluate.py"
], stop_on_error=True)

for i, result in enumerate(results):
    print(f"Command {i+1}: exit code {result.exit_code}")
```
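
If you need custom logic between steps (logging, conditional skips), the `stop_on_error` behavior is easy to emulate with plain asyncio. The sketch below is generic and has no Firework dependency; `run_batch`, `Result`, and the `fake_exec` stub are illustrative stand-ins for `sandbox.process.exec`:

```python
import asyncio
from dataclasses import dataclass

@dataclass
class Result:
    command: str
    exit_code: int

async def run_batch(runner, commands, stop_on_error=True):
    """Run commands sequentially, optionally stopping at the first failure."""
    results = []
    for cmd in commands:
        result = await runner(cmd)
        results.append(result)
        if stop_on_error and result.exit_code != 0:
            break
    return results

# Stub runner for illustration: any command containing "bad" fails
async def fake_exec(cmd):
    return Result(cmd, 1 if "bad" in cmd else 0)

results = asyncio.run(run_batch(fake_exec, ["echo a", "bad step", "echo b"]))
# Stops after the failing second command, so the third never runs
```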

## Filesystem Operations

### Read/Write Files

```python
# Read text
content = await sandbox.filesystem.read("/config.json")

# Read binary
data = await sandbox.filesystem.read_bytes("/model.bin")

# Write text
await sandbox.filesystem.write("/output.txt", "Hello World")

# Write binary
await sandbox.filesystem.write_bytes("/model.weights", model_bytes)
```

### Upload/Download Files

```python
# Upload single file
await sandbox.filesystem.upload("./local.csv", "/data/input.csv")

# Download single file
await sandbox.filesystem.download("/data/output.json", "./result.json")

# Upload entire directory
await sandbox.filesystem.upload_dir("./project", "/workspace")

# Download entire directory
await sandbox.filesystem.download_dir("/results", "./local_results")
```

### Directory Operations

```python
# List directory contents
files = await sandbox.filesystem.list("/data")
for f in files:
    print(f"{f.name}: {f.size} bytes ({f.type})")

# Create directory
await sandbox.filesystem.mkdir("/workspace/output", recursive=True)

# Remove file
await sandbox.filesystem.remove("/tmp/cache.txt")

# Remove directory
await sandbox.filesystem.remove_dir("/tmp/old_data")

# Check existence
exists = await sandbox.filesystem.exists("/app/script.py")
```

## Observability

### Metrics

```python
metrics = await sandbox.get_metrics()
print(f"CPU: {metrics.cpu_percent}%")
print(f"Memory: {metrics.memory_mb} MB")
print(f"Disk: {metrics.disk_mb_used} MB")
print(f"Uptime: {metrics.uptime_seconds}s")
print(f"Processes: {metrics.process_count}")
```
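
For continuous monitoring you can call `get_metrics()` on a timer. The poller below is a generic sketch that works with any async callable; the `fake_metrics` stub stands in for `sandbox.get_metrics`, so the snippet carries no Firework dependency:

```python
import asyncio

async def poll(fetch, interval_seconds=1.0, samples=3):
    """Call an async metrics fetcher on a fixed interval and collect results."""
    collected = []
    for _ in range(samples):
        collected.append(await fetch())
        await asyncio.sleep(interval_seconds)
    return collected

# Stub fetcher standing in for sandbox.get_metrics()
async def fake_metrics():
    return {"cpu_percent": 12.5, "memory_mb": 256}

samples = asyncio.run(poll(fake_metrics, interval_seconds=0.01))
```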

### Events

```python
def on_created(sandbox):
    print(f"Sandbox created: {sandbox.id}")

def on_destroyed(sandbox):
    print(f"Sandbox destroyed: {sandbox.id}")

sandbox.on("created", on_created)
sandbox.on("destroyed", on_destroyed)
```

## Error Handling

```python
from firework import (
    SandboxError,
    SandboxNotFound,
    SandboxTimeout,
    ProcessExecutionError,
    FilesystemError
)

try:
    result = await sandbox.process.exec("nonexistent_command")
except ProcessExecutionError as e:
    print(f"Command failed: exit code {e.exit_code}")
    print(f"Stderr: {e.stderr}")
except SandboxTimeout as e:
    print(f"Operation timed out after {e.timeout_seconds}s")
except SandboxNotFound as e:
    print(f"Sandbox not found: {e.sandbox_id}")
except FilesystemError as e:
    print(f"Filesystem error: {e.operation} on {e.path}")
except SandboxError as e:
    print(f"General sandbox error: {e}")
```
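
Transient failures such as timeouts are often worth retrying. The helper below is a generic retry-with-backoff sketch (not part of the Firework API); it takes the exception types to retry on as a parameter, so you could pass `(SandboxTimeout,)` in real code. The `flaky` stub simulates an operation that succeeds on the third attempt:

```python
import asyncio

async def with_retries(fn, retry_on, attempts=3, backoff_seconds=0.5):
    """Retry an async operation on the given exceptions, with linear backoff."""
    for attempt in range(1, attempts + 1):
        try:
            return await fn()
        except retry_on:
            if attempt == attempts:
                raise
            await asyncio.sleep(backoff_seconds * attempt)

# Stub operation that fails twice, then succeeds
calls = {"n": 0}
async def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError("simulated timeout")
    return "ok"

outcome = asyncio.run(with_retries(flaky, retry_on=(TimeoutError,), backoff_seconds=0.01))
```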

## CLI

```bash
# Create a sandbox
firework create --template python-ml --name my-sandbox

# List running sandboxes
firework list

# Execute command in sandbox
firework exec sbx_abc123 "python --version"

# Destroy sandbox
firework destroy sbx_abc123

# List available environments
firework env list

# Show environment details
firework env info python-ml

# Build an environment
firework env build python-torch --size 4096
```

## Configuration

### Programmatic Configuration

```python
from firework import LocalConfig, set_config

config = LocalConfig(
    runtime_dir="/custom/path/sandboxes",
    env_dir="/custom/path/environments",
    default_template="python-ml",
    default_timeout=120,
    default_vcpu=2,
    default_memory_mb=1024,
    log_level="DEBUG"
)
set_config(config)
```

### Environment Variables

| Variable | Description | Default |
|----------|-------------|---------|
| `FIREWORK_RUNTIME_DIR` | Sandbox runtime directory | `~/.firework/sandboxes` |
| `FIREWORK_ENV_DIR` | Built environments directory | `~/.firework/environments` |
| `FIREWORK_DEFAULT_TEMPLATE` | Default template | `base` |
| `FIREWORK_DEFAULT_TIMEOUT` | Default timeout (seconds) | `60` |
| `FIREWORK_DEFAULT_VCPU` | Default vCPU count | `1` |
| `FIREWORK_DEFAULT_MEMORY_MB` | Default memory (MB) | `512` |
| `FIREWORK_LOG_LEVEL` | Log level | `INFO` |

```bash
export FIREWORK_RUNTIME_DIR=/data/sandboxes
export FIREWORK_DEFAULT_MEMORY_MB=2048
```
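
The usual precedence is: environment variable, if set, otherwise the built-in default. The snippet below illustrates that lookup for `FIREWORK_DEFAULT_MEMORY_MB` in plain Python (it mirrors the table above; it is not necessarily the library's exact lookup code):

```python
import os

def default_memory_mb() -> int:
    """Environment variable wins over the built-in default of 512 MB."""
    return int(os.environ.get("FIREWORK_DEFAULT_MEMORY_MB", "512"))

os.environ.pop("FIREWORK_DEFAULT_MEMORY_MB", None)
unset_value = default_memory_mb()        # built-in default: 512

os.environ["FIREWORK_DEFAULT_MEMORY_MB"] = "2048"
overridden = default_memory_mb()         # override: 2048
```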

## Complete Examples

### Data Analysis Pipeline

```python
import json

async def analyze_csv(csv_path: str) -> dict:
    async with Sandbox.create(template="python-ml") as sandbox:
        # Upload data
        await sandbox.filesystem.upload(csv_path, "/data/input.csv")
        
        # Write analysis script
        await sandbox.filesystem.write("/app/analyze.py", """
import pandas as pd
import json

df = pd.read_csv('/data/input.csv')
results = {
    'rows': len(df),
    'columns': list(df.columns),
    'summary': df.describe().to_dict()
}

with open('/data/results.json', 'w') as f:
    json.dump(results, f, indent=2)

print('Analysis complete!')
""")
        
        # Run analysis
        result = await sandbox.process.exec("python /app/analyze.py")
        print(result.stdout)
        
        # Download results
        await sandbox.filesystem.download("/data/results.json", "./results.json")
        
        # Read and return results
        content = await sandbox.filesystem.read("/data/results.json")
        return json.loads(content)
```

### Machine Learning Training

```python
async def train_model(data_path: str, epochs: int = 10):
    async with Sandbox.create(
        template="python-torch",
        vcpu=2,
        memory_mb=2048,
        timeout_seconds=3600
    ) as sandbox:
        # Upload training data
        await sandbox.filesystem.upload_dir(data_path, "/data")
        
        # Upload training script
        await sandbox.filesystem.upload("./train.py", "/app/train.py")
        
        # Train with streaming output
        stream = await sandbox.process.exec_stream(
            f"python /app/train.py --epochs {epochs}",
            cwd="/app"
        )
        
        async for event in stream:
            if event.type == "stdout":
                print(event.content, end="")
        
        # Download trained model
        await sandbox.filesystem.download("/app/model.pt", "./model.pt")
```

### Web Scraping

```python
import json

async def scrape_urls(urls: list[str]) -> list[dict]:
    async with Sandbox.create(template="python-web") as sandbox:
        # Install additional packages
        await sandbox.process.exec("pip install beautifulsoup4 lxml")
        
        # Write scraper script (the f-string embeds the URL list as a Python literal)
        await sandbox.filesystem.write("/app/scraper.py", f"""
import httpx
from bs4 import BeautifulSoup
import json

urls = {urls}
results = []

for url in urls:
    try:
        resp = httpx.get(url, timeout=10)
        soup = BeautifulSoup(resp.text, 'lxml')
        results.append({{
            'url': url,
            'title': soup.title.string if soup.title else None,
            'status': resp.status_code
        }})
    except Exception as e:
        results.append({{'url': url, 'error': str(e)}})

with open('/app/results.json', 'w') as f:
    json.dump(results, f, indent=2)
""")
        
        # Run scraper
        await sandbox.process.exec("python /app/scraper.py", timeout_seconds=120)
        
        # Get results
        content = await sandbox.filesystem.read("/app/results.json")
        return json.loads(content)
```

### Parallel Processing with Worker Pool

```python
import asyncio

async def parallel_process(items: list[str]) -> list[str]:
    # Create a pool of sandboxes
    pool = await asyncio.gather(*[
        Sandbox.create(template="base") for _ in range(4)
    ])
    
    try:
        # Distribute work across pool
        async def process_item(sandbox, item):
            result = await sandbox.process.exec(f"echo 'Processing: {item}'")
            return result.stdout.strip()
        
        tasks = [
            process_item(pool[i % len(pool)], item)
            for i, item in enumerate(items)
        ]
        
        return await asyncio.gather(*tasks)
    finally:
        # Cleanup all sandboxes
        await asyncio.gather(*(s.destroy() for s in pool))
```
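
Round-robin assignment can oversubscribe a single sandbox when tasks have uneven runtimes; a semaphore-bounded variant caps concurrency instead. The sketch below is generic asyncio with a stub worker (`shout` stands in for a per-item sandbox exec), so it has no Firework dependency:

```python
import asyncio

async def bounded_map(worker, items, limit=4):
    """Apply an async worker to each item, running at most `limit` at a time."""
    semaphore = asyncio.Semaphore(limit)

    async def guarded(item):
        async with semaphore:
            return await worker(item)

    return await asyncio.gather(*(guarded(i) for i in items))

# Stub worker standing in for a per-item sandbox.process.exec call
async def shout(item: str) -> str:
    await asyncio.sleep(0)  # yield control, as a real exec would
    return item.upper()

outputs = asyncio.run(bounded_map(shout, ["a", "b", "c"], limit=2))
# Order is preserved even though execution is concurrent
```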

## Requirements

- Python 3.10+
- Docker (for Docker backend)
- Firecracker (optional, for microVM backend)

## License

MIT License - see [LICENSE](LICENSE) for details.

## Contributing

Contributions are welcome! Feel free to open an issue or submit a pull request.
