Metadata-Version: 2.1
Name: turbinia
Version: 20240313
Summary: Automation and Scaling of Digital Forensics Tools
Home-page: https://github.com/google/turbinia
License: Apache-2.0
Author: Turbinia Developers
Author-email: turbinia-dev@googlegroups.com
Maintainer: Turbinia Developers
Maintainer-email: turbinia-dev@googlegroups.com
Requires-Python: >=3.10,<4.0
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Provides-Extra: worker
Requires-Dist: backoff (>=2.2.1)
Requires-Dist: celery (>=5.2.2,<6.0.0)
Requires-Dist: dfDewey (>=20220603,<20220604) ; extra == "worker"
Requires-Dist: dfimagetools (>=20230806,<20230807) ; extra == "worker"
Requires-Dist: docker (>=6.1.3,<7.0.0)
Requires-Dist: fastapi[all] (>=0.75.0,<0.99.0)
Requires-Dist: filelock
Requires-Dist: google-api-core (<3.0.0)
Requires-Dist: google-generativeai (>=0.3.2)
Requires-Dist: libcloudforensics (==20240214)
Requires-Dist: pandas (>=2.1.0,<3.0.0)
Requires-Dist: plaso (==20240308) ; extra == "worker"
Requires-Dist: prometheus_client (>=0.17.1,<0.18.0)
Requires-Dist: protobuf (>=3.19.0)
Requires-Dist: pydantic (>=1.10.5,<2.0.0)
Requires-Dist: pyglove (>=0.4.4)
Requires-Dist: pyhindsight (>=20230327.0,<20230328.0) ; extra == "worker"
Requires-Dist: ratelimit (>=2.2.1)
Requires-Dist: redis (>=4.4.4,<5.0.0)
Requires-Dist: urllib3 (>=1.25.4,<1.27) ; python_version < "3.10"
Requires-Dist: urllib3 (>=1.25.4,<2.1) ; python_version >= "3.10"
Project-URL: Documentation, https://turbinia.readthedocs.io/en/latest/
Project-URL: Repository, https://github.com/google/turbinia
Description-Content-Type: text/markdown

# Turbinia

## Summary

Turbinia is an open-source framework for deploying, managing, and running
distributed forensic workloads. It is intended to automate the running of common
forensic processing tools (e.g. Plaso, TSK, strings) to help with processing
evidence in the Cloud, scaling the processing of large amounts of evidence, and
decreasing response time by parallelizing processing where possible.

<img src="docs/images/turbinia-logo.jpg?raw=true" width=240>

## How it works

Turbinia is composed of different components for the client, the server, and the
workers. These components can run in the Cloud, on local machines, or as a
hybrid of both. The Turbinia client sends requests to process evidence to the
Turbinia server. The Turbinia server creates logical jobs from these incoming
user requests, and the jobs create and schedule forensic processing tasks to be
run by the workers. The evidence to be processed is split up by the jobs when
possible, and many tasks can be created in order to process the evidence in
parallel. One or more workers run continuously to process tasks from the server.
Any new evidence created or discovered by the tasks is fed back into Turbinia
for further processing.

Communication from the client to the server is currently done with either Google
Cloud PubSub or [Kombu](https://github.com/celery/kombu) messaging. The worker
implementation can use either [PSQ](https://github.com/GoogleCloudPlatform/psq)
(a Google Cloud PubSub Task Queue) or [Celery](http://www.celeryproject.org/)
for task scheduling.

The main documentation for Turbinia can be
[found here](https://turbinia.readthedocs.io/). You can also find out more about
the architecture and
[how it works here](https://turbinia.readthedocs.io/en/latest/user/how-it-works.html).

## Status

Turbinia is currently in Alpha release.

## Installation

There is an
[installation guide here](https://turbinia.readthedocs.io/en/latest/user/install.html).

## Usage

The basic steps to get things running after the initial installation and
configuration are:

*   Start the Turbinia server component with the `turbiniactl server` command
*   Start the Turbinia API server component with the `turbiniactl api_server` command if using Celery
*   Start one or more Turbinia workers with `turbiniactl celeryworker` if using Celery, or `turbiniactl psqworker` if using PSQ
*   Install the client via `pip install turbinia-client`
*   Send evidence to be processed with `turbinia-client submit ${evidencetype}`
*   Check status of running tasks with `turbinia-client status`
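The steps above can be sketched as a shell session. The `rawdisk` evidence type
and its `--source_path` option are used purely as an illustrative example; run
`turbinia-client submit -h` to see the evidence types and options available in
your deployment:

```shell
# Start the server, API server, and a Celery worker (each typically runs
# in its own terminal, container, or service unit).
turbiniactl server
turbiniactl api_server
turbiniactl celeryworker

# Install the client and submit a raw disk image for processing.
pip install turbinia-client
turbinia-client submit rawdisk --source_path /evidence/disk.img

# Check the status of the resulting request and its tasks.
turbinia-client status
```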

`turbinia-client` can be used to interact with Turbinia through the API server
component. Here is the basic usage:

```
$ turbinia-client -h
Usage: turbinia-client [OPTIONS] COMMAND [ARGS]...

  Turbinia API command-line tool (turbinia-client).

                          ***    ***
                           *          *
                      ***             ******
                     *                      *
                     **      *   *  **     ,*
                       *******  * ********
                              *  * *
                              *  * *
                              %%%%%%
                              %%%%%%
                     %%%%%%%%%%%%%%%       %%%%%%
               %%%%%%%%%%%%%%%%%%%%%      %%%%%%%
  %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%  ** *******
  %%                                                   %%  ***************
  %%                                (%%%%%%%%%%%%%%%%%%%  *****  **
    %%%%%        %%%%%%%%%%%%%%%
    %%%%%%%%%%                     %%          **             ***
       %%%                         %%  %%             %%%           %%%%,
       %%%      %%%   %%%   %%%%%  %%%   %%%   %%  %%%   %%%  %%%       (%%
       %%%      %%%   %%%  %%%     %%     %%/  %%  %%%   %%%  %%%  %%%%%%%%
       %%%      %%%   %%%  %%%     %%%   %%%   %%  %%%   %%%  %%% %%%   %%%
       %%%        %%%%%    %%%       %%%%%     %%  %%%    %%  %%%   %%%%%

  This command-line tool interacts with Turbinia's API server.

  You can specify the API server location in ~/.turbinia_api_config.json

Options:
  -c, --config_instance TEXT  A Turbinia instance configuration name.
                              [default: (dynamic)]
  -p, --config_path TEXT      Path to the .turbinia_api_config.json file..
                              [default: (dynamic)]
  -h, --help                  Show this message and exit.

Commands:
  config    Get Turbinia configuration.
  evidence  Get or upload Turbinia evidence.
  jobs      Get a list of enabled Turbinia jobs.
  result    Get Turbinia request or task results.
  status    Get Turbinia request or task status.
  submit    Submit new requests to the Turbinia API server.
```

Check out the `turbinia-client` documentation [page](https://turbinia.readthedocs.io/en/latest/user/turbinia-client.html#turbinia-api-cli-tool-turbinia-client) for a detailed user guide.

You can also interact with Turbinia directly from Python by using the API library. We provide some examples [here](https://github.com/google/turbinia/tree/master/turbinia/api/client).
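As a minimal sketch of what using the API library might look like: the module,
class, and method names below are assumptions based on a typical
OpenAPI-generated Python client, not confirmed API; consult the linked examples
for the authoritative names and request fields.

```python
# Hypothetical sketch of submitting a request via the Turbinia API client
# library. All names below (turbinia_api_lib, TurbiniaRequestsApi,
# create_request, and the request fields) are assumptions -- check the
# linked examples for the actual generated client API.
import turbinia_api_lib
from turbinia_api_lib.api import turbinia_requests_api

# Point the client at a locally running API server.
config = turbinia_api_lib.Configuration(host="http://localhost:8000")

with turbinia_api_lib.ApiClient(config) as api_client:
    requests_api = turbinia_requests_api.TurbiniaRequestsApi(api_client)
    # Submit a processing request for a raw disk image (fields assumed).
    request = {"evidence": {"type": "rawdisk", "source_path": "/evidence/disk.img"}}
    response = requests_api.create_request(request)
    print(response)
```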

## Other documentation

*   [Main Documentation](https://turbinia.readthedocs.io)
*   [Installation](https://turbinia.readthedocs.io/en/latest/user/install.html)
*   [How it works](https://turbinia.readthedocs.io/en/latest/user/how-it-works.html)
*   [Operational Details](https://turbinia.readthedocs.io/en/latest/user/operational-details.html)
*   [Turbinia client CLI tool](https://turbinia.readthedocs.io/en/latest/user/turbinia-client.html#turbinia-api-cli-tool-turbinia-client)
*   [Turbinia API server](https://turbinia.readthedocs.io/en/latest/user/api-server.html)
*   [Turbinia Python API library](https://github.com/google/turbinia/tree/master/turbinia/api/client)
*   [Contributing to Turbinia](https://turbinia.readthedocs.io/en/latest/developer/contributing.html)
*   [Developing new Tasks](https://turbinia.readthedocs.io/en/latest/developer/developing-new-tasks.html)
*   [FAQ](https://turbinia.readthedocs.io/en/latest/user/faq.html)
*   [Debugging and Common Errors](https://turbinia.readthedocs.io/en/latest/user/debugging.html)
*   [Using Docker to execute jobs](https://turbinia.readthedocs.io/en/latest/user/using-docker.html)

##### Obligatory Fine Print

This is not an official Google product (experimental or otherwise); it is just
code that happens to be owned by Google.

