Metadata-Version: 2.4
Name: branchkey
Version: 2.7.0
Summary: Client application to interface with the BranchKey system
Home-page: https://branchkey.com
Author: BranchKey
Author-email: info@branchkey.com
Project-URL: Homepage, https://branchkey.com
Project-URL: Repository, https://gitlab.com/branchkey/client_application
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: License :: OSI Approved :: GNU General Public License v3 (GPLv3)
Classifier: Operating System :: OS Independent
Requires-Python: >=3.9
Description-Content-Type: text/markdown
Requires-Dist: requests==2.32.3
Requires-Dist: numpy==1.26.4
Requires-Dist: pika==1.3.2
Requires-Dist: pysocks==1.7.1
Dynamic: author
Dynamic: author-email
Dynamic: classifier
Dynamic: description
Dynamic: description-content-type
Dynamic: home-page
Dynamic: project-url
Dynamic: requires-dist
Dynamic: requires-python
Dynamic: summary

# BranchKey Python Client

![BK_logo](https://branchkey.com/branding/bk-logo-medium.png)

[![PyPI version](https://badge.fury.io/py/branchkey.svg)](https://badge.fury.io/py/branchkey)
[![Python](https://img.shields.io/pypi/pyversions/branchkey.svg)](https://pypi.org/project/branchkey/)
[![License: GPL v3](https://img.shields.io/badge/License-GPLv3-blue.svg)](https://www.gnu.org/licenses/gpl-3.0)

Official Python client for the BranchKey federated learning platform. This library provides a simple interface to upload model weights, download aggregated results, and track training runs.

## Installation

```bash
pip install branchkey
```

**Requirements:** Python 3.9 or higher

## Quick Start

### 1. Get Credentials

Create a leaf entity via the BranchKey platform's `/v2/entities` API endpoint to obtain credentials:

```python
credentials = {
    "id": "your-leaf-uuid",
    "name": "my-client",
    "session_token": "your-session-token-uuid",
    "owner_id": "your-user-uuid",
    "tree_id": "your-tree-uuid",
    "branch_id": "your-branch-uuid"
}
```

### 2. Initialize Client

```python
from branchkey.client import Client

# Connect to BranchKey
client = Client(credentials, host="https://app.branchkey.com")
```

### 3. Upload Model Weights

```python
import numpy as np

# Prepare the update in the required format: [num_samples, [parameter arrays]]
num_samples = 1000
parameters = [layer1_weights, layer2_weights, ...]  # one NumPy array per layer
model_update = np.array([num_samples, parameters], dtype=object)

# Save and upload
with open("model_weights.npy", "wb") as f:
    np.save(f, model_update)

file_id = client.file_upload("model_weights.npy")
print(f"Uploaded: {file_id}")
```

### 4. Download Aggregated Results

```python
# Check for aggregation notifications
if not client.queue.empty():
    aggregation_id = client.queue.get(block=False)
    client.file_download(aggregation_id)
    # Downloaded to: ./aggregated_files/{aggregation_id}.npy
```
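Per the example above, `client.queue` behaves like a standard `queue.Queue` of aggregation IDs. A simple drain loop over pending notifications might look like the sketch below; a plain `queue.Queue` stands in for a live client, and the `agg-1234` ID is illustrative:

```python
import queue

# Stand-in for client.queue: a queue.Queue that receives
# aggregation IDs as notifications arrive
notifications = queue.Queue()
notifications.put("agg-1234")

downloaded = []
while not notifications.empty():
    try:
        agg_id = notifications.get(block=False)
    except queue.Empty:
        break
    # In real code: client.file_download(agg_id)
    downloaded.append(agg_id)
```

Draining with `block=False` keeps the loop non-blocking, so it can run between training rounds without stalling.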

## Configuration Options

```python
client = Client(
    credentials,
    host="https://app.branchkey.com",  # API endpoint
    rbmq_host=None,                     # RabbitMQ host (auto-derived from host)
    rbmq_port=5672,                     # RabbitMQ port
    ssl=True,                           # Verify SSL certificates
    wait_for_run=False,                 # Wait if run is paused
    run_check_interval_s=30,            # Run status check interval
    proxies=None                        # HTTP/HTTPS proxy dict
)
```

## Model Weight Format

Model weights must be NumPy arrays in this format:

```python
[num_samples, [list_of_parameter_arrays]]
```

- `num_samples` (int): Number of training samples
- `list_of_parameter_arrays` (list): List of NumPy arrays with model parameters
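The format can be checked locally with a small save/load round trip (dummy shapes, no BranchKey calls involved). Note that object arrays require `allow_pickle=True` when loading:

```python
import numpy as np

# Build a dummy update in the required [num_samples, [parameter arrays]]
# format; the shapes here are arbitrary placeholders
num_samples = 10
parameters = [np.zeros((3, 3), dtype=np.float32), np.ones(3, dtype=np.float32)]
model_update = np.array([num_samples, parameters], dtype=object)

with open("model_weights.npy", "wb") as f:
    np.save(f, model_update)

# Object arrays require allow_pickle=True on load
loaded = np.load("model_weights.npy", allow_pickle=True)
n, params = loaded[0], loaded[1]
```

This same round trip is what the upload/download cycle performs on your behalf, so it is a quick way to validate a conversion pipeline before submitting real updates.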

### PyTorch Example

```python
# Extract parameters from a PyTorch model as NumPy arrays
parameters = []
for name, param in model.named_parameters():
    parameters.append(param.detach().cpu().numpy())

model_update = np.array([num_samples, parameters], dtype=object)
```

Or use the built-in helper:

```python
update = client.convert_pytorch_numpy(
    model.named_parameters(),
    weighting=num_samples
)
```

### Required File Format

Save the weights as a NumPy `.npy` file containing the `[num_samples, [parameter arrays]]` structure described above:

```python
with open("./test.npy", "wb") as f:
    np.save(f, parameter_array)
```

- `num_samples`: the number of samples that contributed to this update
- parameter arrays: the model parameters, one NumPy array per layer

A truncated example of the exported array:


```python
(1329, [array([[[[ 1.71775490e-01,  3.02851666e-02,  2.90171858e-02,
          -4.27578250e-03,  1.14474617e-01],
         [-8.07138346e-03,  1.44909814e-01, -5.36724664e-02,
          -3.51673253e-02, -1.82426855e-01],
         [ 6.75795972e-02, -1.72839850e-01, -7.25025982e-02,
          -1.59504730e-02,  1.60634145e-01],
         [ 6.62277341e-02, -2.26575769e-02, -1.65369093e-01,
          -8.67117420e-02,  1.80021569e-01],
         [-6.11407161e-02, -1.59245610e-01,  1.45820528e-01,
          -5.40512279e-02, -5.19061387e-02]]],
        ....
         [-1.44068539e-01,  6.15987852e-02,  1.83321223e-01,
          -1.79076958e-02, -1.53445438e-01],
         [-7.76787996e-02,  7.64556080e-02,  9.43044946e-02,
           1.63337544e-01, -1.69042274e-01],
         [-8.55994076e-02, -1.23661250e-01,  1.48442864e-01,
          -1.35983482e-01,  2.05254350e-02]]]], dtype=float32), array([ 0.13065006,  0.12797254, -0.12818147, -0.09621437,  0.04100017,
       -0.07248228,  0.02753541,  0.00476395, -0.11270998,  0.11353076,
       -0.0167569 ,  0.12654744, -0.05019006, -0.07281244,  0.03892357,
       -0.09698197, -0.06845284, -0.04604543, -0.01372138, -0.052395  ,
        0.04833373,  0.16228785,  0.09982517,  0.19556762,  0.10631064,
        0.02496212, -0.14297573, -0.10442089,  0.01970248, -0.1684099 ,
       -0.05076171,  0.19325127], dtype=float32), array([[[[-3.42470817e-02,  8.76816106e-04, -2.13724039e-02,
          -2.62880027e-02, -1.86583996e-02],
         [ 2.56936941e-02, -1.97169576e-02, -3.45735364e-02,
          -4.32738848e-03, -1.22306980e-02],
         [ 8.36322457e-03,  3.26042138e-02, -1.50063485e-02,
          -1.85401291e-02,  2.39207298e-02],
         [-1.15280924e-02, -3.47947963e-02,  2.17274204e-02,
           1.80862695e-02,  2.19682772e-02],
...
etc
```

## Performance Metrics

Submit training or testing metrics:

```python
import json

metrics = {"accuracy": 0.95, "loss": 0.12}
client.send_performance_metrics(
    aggregation_id="aggregation-uuid",
    data=json.dumps(metrics),
    mode="test"  # "test", "train", or "non-federated"
)
```

## Client Properties

```python
client.run_status        # Current run status: "start", "stop", or "pause"
client.run_number        # Current run iteration
client.leaf_id           # Your leaf UUID
client.branch_id         # Parent branch UUID
client.is_authenticated  # Authentication status
```
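These properties can drive simple control flow. The built-in `wait_for_run` option already handles paused runs, but a hypothetical helper (not part of the library) that polls `run_status` manually might look like this:

```python
import time

def wait_until_running(client, poll_s=5, timeout_s=300):
    """Hypothetical helper: block until the run leaves the "pause" state."""
    deadline = time.time() + timeout_s
    while client.run_status == "pause":
        if time.time() > deadline:
            raise TimeoutError("run still paused after timeout")
        time.sleep(poll_s)
```

A timeout keeps a client from blocking forever if an operator never resumes the run.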

## Proxy Support

For networks requiring proxy access:

```python
proxies = {
    'http': 'http://user:password@proxy.example.com:8080',
    'https': 'http://user:password@proxy.example.com:8080',
}
client = Client(credentials, host="https://app.branchkey.com", proxies=proxies)
```

## Development

### Running Tests

```bash
# Clone repository
git clone https://gitlab.com/branchkey/client_application.git
cd client_application

# Run tests with Docker (requires Docker)
make local-test

# Or manually
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
python -m unittest -v
```

## Support

- **Website**: [https://branchkey.com](https://branchkey.com)
- **Repository**: [https://gitlab.com/branchkey/client_application](https://gitlab.com/branchkey/client_application)
- **Email**: info@branchkey.com

---

**BranchKey** - Federated Learning Platform
