Metadata-Version: 2.1
Name: mlgw-bns
Version: 0.11.0a1
Summary: Accelerating gravitational wave template generation with machine learning.
Home-page: https://github.com/jacopok/mlgw_bns
License: GNU GPL3
Keywords: python,gravitational-waves,scientific
Author: Jacopo Tissino
Author-email: jacopo@tissino.it
Requires-Python: >=3.7,<3.10
Classifier: License :: OSI Approved :: GNU General Public License v3 (GPLv3)
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Provides-Extra: docs
Provides-Extra: torch
Requires-Dist: MarkupSafe (==2.0.1); extra == "docs"
Requires-Dist: PyCBC (==2.0.2)
Requires-Dist: Sphinx (>=4.3.1,<5.0.0); extra == "docs"
Requires-Dist: h5py (>=3.6.0,<4.0.0)
Requires-Dist: joblib (>=1.1.0,<2.0.0)
Requires-Dist: myst-parser (>=0.15.2,<0.16.0); extra == "docs"
Requires-Dist: numba (>=0.55.0,<0.56.0)
Requires-Dist: numpy (>=1.18,<1.23)
Requires-Dist: optuna (>=2.10.0,<3.0.0)
Requires-Dist: plotly (>=5.5.0,<6.0.0)
Requires-Dist: readthedocs-sphinx-search (>=0.1.1,<0.2.0); extra == "docs"
Requires-Dist: scikit-learn (>=1.0.1,<2.0.0)
Requires-Dist: sortedcontainers (>=2.4.0,<3.0.0)
Requires-Dist: sphinx-rtd-theme (>=1.0.0,<2.0.0); extra == "docs"
Requires-Dist: toml (>=0.10.2,<0.11.0)
Requires-Dist: torch (>=1.10.2,<2.0.0); extra == "torch"
Requires-Dist: tqdm (>=4.62.3,<5.0.0)
Requires-Dist: types-setuptools (>=57.4.7,<58.0.0)
Project-URL: Repository, https://github.com/jacopok/mlgw_bns
Description-Content-Type: text/markdown

[![CI Pipeline for mlgw_bns](https://github.com/jacopok/mlgw_bns/actions/workflows/ci.yaml/badge.svg)](https://github.com/jacopok/mlgw_bns/actions/workflows/ci.yaml)
[![Documentation Status](https://readthedocs.org/projects/mlgw-bns/badge/?version=latest)](https://mlgw-bns.readthedocs.io/en/latest/?badge=latest)
[![PyPI version](https://badge.fury.io/py/mlgw-bns.svg)](https://badge.fury.io/py/mlgw-bns)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
[![Coverage Status](https://coveralls.io/repos/github/jacopok/mlgw_bns/badge.svg?branch=master)](https://coveralls.io/github/jacopok/mlgw_bns?branch=master)
[![Downloads](https://pepy.tech/badge/mlgw-bns/week)](https://pepy.tech/project/mlgw-bns)

# Machine Learning for Gravitational Waves from Binary Neutron Star mergers

This package speeds up the generation of template gravitational waveforms for binary neutron star mergers by training a machine learning model on a dataset of waveforms generated with a physically motivated surrogate.

It can reconstruct waveforms with mismatches lower than 1/10000
using as few as 1000 training waveforms;
the accuracy then steadily improves as more training waveforms are used.
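To make the accuracy figure concrete, the mismatch between two frequency-domain waveforms can be sketched as below. This is a toy version with synthetic waveforms and a flat (unit) noise spectrum; a realistic computation also weights by the detector noise power spectral density and maximizes over relative time and phase shifts.

```python
import numpy as np

def mismatch(h1, h2, df=1.0):
    """Mismatch (1 - match) between two frequency-domain waveforms
    sampled on the same grid, with a flat unit noise spectrum."""
    inner = lambda a, b: 4 * df * np.real(np.sum(a * np.conj(b)))
    match = inner(h1, h2) / np.sqrt(inner(h1, h1) * inner(h2, h2))
    return 1 - match

# Two nearly identical synthetic frequency-domain signals:
# same amplitude profile, a tiny relative time shift in the phase.
f = np.linspace(20, 2048, 4096)
h1 = np.exp(-f / 500) * np.exp(2j * np.pi * f * 0.01)
h2 = np.exp(-f / 500) * np.exp(2j * np.pi * f * 0.0100001)
print(mismatch(h1, h2))  # well below the 1/10000 target
```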

Currently, the only waveform model used to generate the training data is [`TEOBResumS`](http://arxiv.org/abs/1806.01772),
but support for other models is planned.

The documentation can be found [here](https://mlgw-bns.readthedocs.io/en/latest).

<!-- ![dependencygraph](mlgw_bns.svg) -->

## Installation

To install the package, use
```bash
pip install mlgw-bns
```

For more details see [the documentation](https://mlgw-bns.readthedocs.io/en/latest/usage_guides/install.html).

## Changelog

Changes across versions are documented since version 0.10.1 in the [CHANGELOG](https://github.com/jacopok/mlgw_bns/blob/master/CHANGELOG.md).

## Inner workings

The main steps taken by `mlgw_bns` to train on a dataset are as follows:

- generate the dataset, consisting of effective-one-body (EOB) waveforms
- decompose the Fourier transforms of the waveforms into phase and amplitude
- downsample the dataset to a few thousand frequency points
- compute the residuals of the EOB waveforms from post-Newtonian (PN) ones
- apply principal component analysis (PCA) to reduce the dimensionality to a few tens of real numbers
- train a neural network on the relation
    between the waveform parameters and the PCA components

After this, the model can reconstruct a waveform anywhere within its parameter space,
resampled at arbitrary points in frequency space.
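The last two steps above can be illustrated with a toy scikit-learn pipeline. The arrays here are synthetic stand-ins for the downsampled EOB-minus-PN residuals; the package's actual implementation differs in many details.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Stand-ins for the real pipeline: "params" plays the role of the
# waveform parameters (masses, tidal deformabilities, ...), and
# "residuals" the role of the downsampled EOB-minus-PN residuals.
n_waveforms, n_points = 1000, 200
params = rng.uniform(size=(n_waveforms, 3))
residuals = params @ rng.normal(size=(3, n_points))  # fake smooth dependence

# Reduce each residual to a few tens of PCA components...
pca = PCA(n_components=20)
components = pca.fit_transform(residuals)

# ...and learn the map from waveform parameters to PCA components.
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
net.fit(params, components)

# Reconstruction at a new parameter point: network -> inverse PCA.
new_params = rng.uniform(size=(1, 3))
predicted_residual = pca.inverse_transform(net.predict(new_params))
print(predicted_residual.shape)  # (1, 200)
```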

Several of the training steps involve data-driven optimizations:

- the points at which the waveforms are downsampled are not chosen uniformly:
    instead, a greedy downsampling algorithm determines them
- the hyperparameters of the neural network are optimized according to both
    the time taken for the training and the estimated reconstruction error,
    also varying the number of available training waveforms
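The idea behind greedy, error-driven downsampling can be sketched as follows; this is a toy illustration, not the package's actual algorithm.

```python
import numpy as np

def greedy_downsample(x, y, n_points):
    """Greedily pick sample indices: start from the endpoints, then
    repeatedly add the point where linear interpolation through the
    current samples has the largest error."""
    chosen = [0, len(x) - 1]
    while len(chosen) < n_points:
        idx = sorted(chosen)
        interp = np.interp(x, x[idx], y[idx])
        worst = int(np.argmax(np.abs(y - interp)))
        if worst in chosen:  # interpolation is already exact everywhere
            break
        chosen.append(worst)
    return sorted(chosen)

# A chirp-like oscillating curve: the greedy points automatically
# cluster where the function varies fastest.
x = np.linspace(0, 1, 2000)
y = np.sin(2 * np.pi * 5 * x**2)
idx = greedy_downsample(x, y, 50)
err = np.max(np.abs(y - np.interp(x, x[idx], y[idx])))
print(len(idx), err)
```

With 50 well-placed points the 2000-point curve is recovered to a small interpolation error, whereas uniformly spaced points would waste resolution on the slowly varying region.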
    
