Metadata-Version: 2.1
Name: aviv-aws-costexplorer
Version: 0.1.3
Summary: Aviv AWS CostExplorer python library
Home-page: https://github.com/aviv-group/aviv-aws-costexplorer
Author: Jules Clement
Author-email: jules.clement@aviv-group.com
License: UNKNOWN
Platform: UNKNOWN
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Information Technology
Classifier: Intended Audience :: System Administrators
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3 :: Only
Classifier: Programming Language :: Python :: 3.8
Classifier: Operating System :: OS Independent
Classifier: Topic :: Utilities
Classifier: Topic :: Office/Business
Classifier: Topic :: Office/Business :: Financial
Classifier: Topic :: Office/Business :: Financial :: Accounting
Classifier: Topic :: System
Classifier: Topic :: System :: Logging
Classifier: Topic :: System :: Monitoring
Classifier: Topic :: System :: Systems Administration
Classifier: Typing :: Typed
Requires-Python: >=3.8
Description-Content-Type: text/markdown
Provides-Extra: datastore
License-File: LICENSE

# Aviv AWS CostExplorer

Aims to provide a quick and comprehensive interface to the AWS Cost Explorer API.
It is useful for extracting cost and usage (aka CAU) data, saving it, and making it available for reporting and analysis.

## Requirements

- python >= 3.8
- boto3
- Access to the AWS Cost Explorer API (the `ce:GetCostAndUsage` permission)
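
Cost Explorer access can be granted with an IAM policy along these lines (a minimal sketch; Cost Explorer does not support resource-level permissions, so `Resource` must be `*` — adapt to your account's policy conventions):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["ce:GetCostAndUsage"],
      "Resource": "*"
    }
  ]
}
```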

## Usage

```shell
pip install aviv-aws-costexplorer

# Install the extra libraries needed to save/read data on AWS S3: pandas, awswrangler
# (quotes keep shells like zsh from expanding the brackets)
pip install "aviv-aws-costexplorer[datastore]"
```

Sample code:

```python
from aviv_aws_costexplorer import costreporter

cr = costreporter.CostReporter()
# Fetches the last 3 months of costs by default
costs = cr.get_cost_and_usage()
print(costs)

# Display the results nicely
import pandas as pd
df = pd.DataFrame(costs)
df.head()

# Store on S3 and make the data available through Athena
# (requires the [datastore] extra, which uses awswrangler)
from aviv_aws_costexplorer import datastore

ds = datastore.DataStore(database='test', bucket='my-s3-bucket')
ds.to_parquet(data=costs, path='monthly/last3', database='monthly')
```
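
Beyond `df.head()`, a pandas pivot is a handy way to summarize the retrieved records. The records below are made-up sample data for illustration; the actual field names returned by `get_cost_and_usage()` depend on the library and your query:

```python
import pandas as pd

# Hypothetical records shaped like (period, service, amount) —
# not real output of the Cost Explorer API.
costs = [
    {"period": "2023-01", "service": "AmazonS3", "amount": 12.5},
    {"period": "2023-01", "service": "AmazonEC2", "amount": 40.0},
    {"period": "2023-02", "service": "AmazonS3", "amount": 13.0},
    {"period": "2023-02", "service": "AmazonEC2", "amount": 38.5},
]

df = pd.DataFrame(costs)
# One row per billing period, one column per service
summary = df.pivot_table(index="period", columns="service", values="amount")
print(summary)
```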

## Development

```bash
pipenv install -d
```

## Test, Build, Release

```bash
# Run tests
pipenv run pytest -v tests/

# Build python package
python3 setup.py sdist bdist_wheel

# Release on testpypi
python3 -m twine upload --repository testpypi dist/*
```

Note: the PyPI release is also performed as part of the CI/CD pipeline.

## Contribute

Yes please! Send us your PRs.


