Metadata-Version: 2.1
Name: AttentionLSTM
Version: 0.1
Summary: Combining LSTM with attention
Home-page: https://github.com/AtrCheema/AttentionLSTM
Author: Ather Abbas
Author-email: ather_abbas786@yahoo.com
Classifier: Development Status :: 4 - Beta
Classifier: Natural Language :: English
Classifier: Intended Audience :: Science/Research
Classifier: Intended Audience :: Developers
Classifier: Operating System :: Microsoft :: Windows
Classifier: Operating System :: POSIX :: Linux
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Description-Content-Type: text/markdown
License-File: LICENCE

[![Documentation Status](https://readthedocs.org/projects/attentionlstm/badge/?version=latest)](https://attentionlstm.readthedocs.io/en/latest/?badge=latest)

Combining a self-attention mechanism with LSTM

```python
import numpy as np
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Dense
from atten_lstm import AttentionLSTM

seq_len = 20    # length of each input sequence
num_inputs = 2  # number of input features per timestep

inp = Input(shape=(seq_len, num_inputs))
outs = AttentionLSTM(num_inputs, 16)(inp)
outs = Dense(1)(outs)

model = Model(inputs=inp, outputs=outs)
model.compile(loss="mse")

model.summary()

# train on random data for illustration
x = np.random.random((100, seq_len, num_inputs))
y = np.random.random((100, 1))
h = model.fit(x=x, y=y)
```
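
The general idea behind attention over LSTM states can be sketched with plain Keras layers. The `SimpleAttentionLSTM` class below is a hypothetical illustration, not this library's actual implementation: it runs an LSTM with `return_sequences=True`, scores each timestep with a small dense layer, softmax-normalizes the scores into attention weights, and returns the weighted sum of the hidden states.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

# Hypothetical sketch of self-attention over LSTM hidden states;
# atten_lstm's internals may differ.
class SimpleAttentionLSTM(layers.Layer):
    def __init__(self, units, **kwargs):
        super().__init__(**kwargs)
        self.lstm = layers.LSTM(units, return_sequences=True)
        self.score = layers.Dense(1)  # one scalar score per timestep

    def call(self, inputs):
        h = self.lstm(inputs)                           # (batch, time, units)
        weights = tf.nn.softmax(self.score(h), axis=1)  # (batch, time, 1)
        # weighted sum over time collapses the sequence to one vector
        return tf.reduce_sum(weights * h, axis=1)       # (batch, units)

x = np.random.random((4, 20, 2)).astype("float32")
out = SimpleAttentionLSTM(16)(x)
print(out.shape)
```

Because the attention weights sum to one over the time axis, the output is a convex combination of the LSTM's hidden states, letting the model emphasize the most informative timesteps.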

For a more comprehensive illustration, see the [examples](https://attentionlstm.readthedocs.io/en/latest/auto_examples/index.html).
