Metadata-Version: 2.1
Name: openllm-client
Version: 0.4.1
Summary: OpenLLM Client: Interacting with OpenLLM HTTP/gRPC server, or any BentoML server.
Project-URL: Blog, https://modelserving.com
Project-URL: Chat, https://discord.gg/openllm
Project-URL: Documentation, https://github.com/bentoml/OpenLLM/blob/main/openllm-client/README.md
Project-URL: GitHub, https://github.com/bentoml/OpenLLM/blob/main/openllm-client
Project-URL: History, https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md
Project-URL: Homepage, https://bentoml.com
Project-URL: Tracker, https://github.com/bentoml/OpenLLM/issues
Project-URL: Twitter, https://twitter.com/bentomlai
Author-email: Aaron Pham <aarnphm@bentoml.com>, BentoML Team <contact@bentoml.com>
License-Expression: Apache-2.0
License-File: LICENSE.md
Keywords: AI,Alpaca,BentoML,Falcon,Fine tuning,Generative AI,LLMOps,Large Language Model,Llama 2,MLOps,Model Deployment,Model Serving,PyTorch,Serverless,StableLM,Transformers,Vicuna
Classifier: Development Status :: 5 - Production/Stable
Classifier: Environment :: GPU :: NVIDIA CUDA
Classifier: Environment :: GPU :: NVIDIA CUDA :: 11.7
Classifier: Environment :: GPU :: NVIDIA CUDA :: 11.8
Classifier: Environment :: GPU :: NVIDIA CUDA :: 12
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Science/Research
Classifier: Intended Audience :: System Administrators
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3 :: Only
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: Python :: Implementation :: PyPy
Classifier: Topic :: Scientific/Engineering
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: Topic :: Software Development :: Libraries
Classifier: Typing :: Typed
Requires-Python: >=3.8
Requires-Dist: attrs>=23.1.0
Requires-Dist: cattrs>=23.1.0
Requires-Dist: httpx
Requires-Dist: orjson
Provides-Extra: agents
Requires-Dist: diffusers; extra == 'agents'
Requires-Dist: soundfile; extra == 'agents'
Requires-Dist: transformers[agents]>=4.30; extra == 'agents'
Provides-Extra: full
Requires-Dist: openllm-client[agents,grpc]; extra == 'full'
Provides-Extra: grpc
Requires-Dist: bentoml[grpc]>=1.1.6; extra == 'grpc'
Description-Content-Type: text/markdown

<p align="center">
  <a href="https://github.com/bentoml/openllm">
    <img src="https://raw.githubusercontent.com/bentoml/openllm/main/.github/assets/main-banner.png" alt="Banner for OpenLLM" />
  </a>
</p>


<div align="center">
    <h1 align="center">👾 OpenLLM Client</h1>
    <a href="https://pypi.org/project/openllm-client">
        <img src="https://img.shields.io/pypi/v/openllm-client.svg?logo=pypi&label=PyPI&logoColor=gold" alt="pypi_status" />
    </a><a href="https://test.pypi.org/project/openllm-client/">
        <img src="https://img.shields.io/badge/Nightly-PyPI?logo=pypi&label=PyPI&color=gray&link=https%3A%2F%2Ftest.pypi.org%2Fproject%2Fopenllm%2F" alt="test_pypi_status" />
    </a><a href="https://twitter.com/bentomlai">
        <img src="https://badgen.net/badge/icon/@bentomlai/1DA1F2?icon=twitter&label=Follow%20Us" alt="Twitter" />
    </a><a href="https://l.bentoml.com/join-openllm-discord">
        <img src="https://badgen.net/badge/icon/OpenLLM/7289da?icon=discord&label=Join%20Us" alt="Discord" />
    </a><a href="https://github.com/bentoml/OpenLLM/actions/workflows/ci.yml">
        <img src="https://github.com/bentoml/OpenLLM/actions/workflows/ci.yml/badge.svg?branch=main" alt="ci" />
    </a><a href="https://results.pre-commit.ci/latest/github/bentoml/OpenLLM/main">
        <img src="https://results.pre-commit.ci/badge/github/bentoml/OpenLLM/main.svg" alt="pre-commit.ci status" />
    </a><br>
    <a href="https://pypi.org/project/openllm-client">
        <img src="https://img.shields.io/pypi/pyversions/openllm-client.svg?logo=python&label=Python&logoColor=gold" alt="python_version" />
    </a><a href="https://github.com/pypa/hatch">
        <img src="https://img.shields.io/badge/%F0%9F%A5%9A-Hatch-4051b5.svg" alt="Hatch" />
    </a><a href="https://github.com/bentoml/OpenLLM/blob/main/STYLE.md">
        <img src="https://img.shields.io/badge/code%20style-experimental-000000.svg" alt="code style" />
    </a><a href="https://github.com/astral-sh/ruff">
        <img src="https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/charliermarsh/ruff/main/assets/badge/v2.json" alt="Ruff" />
    </a><a href="https://github.com/python/mypy">
        <img src="https://img.shields.io/badge/types-mypy-blue.svg" alt="types - mypy" />
    </a><a href="https://github.com/microsoft/pyright">
        <img src="https://img.shields.io/badge/types-pyright-yellow.svg" alt="types - pyright" />
    </a><br>
    <p>OpenLLM Client: interact with an OpenLLM HTTP/gRPC server, or any BentoML server.<br/></p>
    <i></i>
</div>

## 📖 Introduction

With OpenLLM, you can run inference with any open-source large language model,
deploy models to the cloud or on-premises, and build powerful AI applications.

To learn more about OpenLLM, please visit <a href="https://github.com/bentoml/OpenLLM">OpenLLM's README.md</a>

This package holds the underlying client implementation for OpenLLM. If you are
coming from OpenLLM, the client can be accessed via `openllm.client`.

It provides APIs similar to [`bentoml.Client`](https://docs.bentoml.com/en/latest/guides/client.html)
(via `openllm_client.min`) for interacting with an OpenLLM server. It can also be extended to work
with a general BentoML server.

> [!NOTE]
> Interoperability with a generic BentoML server is considered _EXPERIMENTAL_ and
> will be refactored into a new client implementation soon.
> If you only use this package to interact with an OpenLLM server, the API is the same as the `openllm.client` namespace.

```python
import openllm

client = openllm.client.HTTPClient()

client.query('Explain to me the difference between "further" and "farther"')
```
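If you prefer raw HTTP over the client wrapper, the same request can be issued with just the standard library. Note that the `/v1/generate` route, the `{'prompt': ...}` payload shape, and the default server address below are assumptions for illustration and may differ between server versions; verify them against your running server's `/docs` page:

```python
import json
from urllib import request


def build_generate_request(server, prompt):
    # The /v1/generate path and {'prompt': ...} payload are assumptions here;
    # verify the exact route and schema against your server's /docs page.
    url = f'{server.rstrip("/")}/v1/generate'
    return url, {'prompt': prompt}


def query(prompt, server='http://localhost:3000'):
    # Issue the POST with the standard library only; requires a running server.
    url, payload = build_generate_request(server, prompt)
    req = request.Request(
        url,
        data=json.dumps(payload).encode('utf-8'),
        headers={'Content-Type': 'application/json'},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)
```

The dedicated client classes above remain the recommended path; this sketch is only useful where installing the client package is not an option.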

<p align="center">
  <img src="https://raw.githubusercontent.com/bentoml/openllm/main/.github/assets/output.gif" alt="Gif showing OpenLLM Intro" />
</p>

<p align="center">
  <img src="https://raw.githubusercontent.com/bentoml/openllm/main/.github/assets/agent.gif" alt="Gif showing Agent integration" />
</p>

## 📔 Citation

If you use OpenLLM in your research, please cite it using the following [citation](../CITATION.cff):

```bibtex
@software{Pham_OpenLLM_Operating_LLMs_2023,
  author = {Pham, Aaron and Yang, Chaoyu and Sheng, Sean and Zhao, Shenyang and Lee, Sauyon and Jiang, Bo and Dong, Fog and Guan, Xipeng and Ming, Frost},
  license = {Apache-2.0},
  month = jun,
  title = {{OpenLLM: Operating LLMs in production}},
  url = {https://github.com/bentoml/OpenLLM},
  year = {2023}
}
```

---

[See the full changelog](https://github.com/bentoml/openllm/blob/main/CHANGELOG.md)
