pip install keras-self-attention==0.50.0
Attention mechanism for processing sequential data that considers the context for each timestamp
Among the top 2% of packages on PyPI.
Over 135.7K downloads in the last 90 days.
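A minimal sketch of typical usage: the package's SeqSelfAttention layer is stacked on top of a recurrent encoder so each timestep's representation is re-weighted by its context. The layer and its attention_activation argument come from the package; the vocabulary size, layer widths, and five-label output are illustrative assumptions, not prescribed by the library.

import keras
from keras_self_attention import SeqSelfAttention

# Sequence tagger: embedding -> BiLSTM -> self-attention -> per-timestep dense.
# All sizes below are illustrative assumptions.
model = keras.models.Sequential()
model.add(keras.layers.Embedding(input_dim=10000, output_dim=300, mask_zero=True))
model.add(keras.layers.Bidirectional(keras.layers.LSTM(units=128, return_sequences=True)))
# Re-weights each timestep's output using the surrounding context.
model.add(SeqSelfAttention(attention_activation='sigmoid'))
model.add(keras.layers.Dense(units=5, activation='softmax'))
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['categorical_accuracy'])
model.summary()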
Commonly used with keras-self-attention
Based on how often these packages appear together in public requirements.txt files on GitHub.
Transformer implemented in Keras
Feed forward layer implemented in Keras
Calculate similarity with embedding
Position embedding layers in Keras
A wrapper layer for stacking layers horizontally
A Python API for managing genotype-phenotype map data
Use evolutionary algorithms instead of gridsearch in scikit-learn.
A library for Probabilistic Graphical Models
A Tasty Python Binding with MeCab (FFI-based, no SWIG or compiler necessary)
Reasoning on the response of logical signaling networks with Answer Set Programming
Python implementation of Portable Format for Analytics (PFA): producer, converter, and consumer.
Upload data to a Midas Server application with Python.
Annotated Data.
The electronic structure package for quantum computers.
Lightweight pipelining: using Python functions as pipeline jobs.
Cython implementation of Python's collections.OrderedDict
SMAC3, a Python implementation of 'Sequential Model-based Algorithm Configuration'.
Version downloads for keras-self-attention
Proportion of downloaded versions in the last 3 months (only versions over 1%).
0.50.0: 86.18%
0.46.0: 9.42%
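The description's claim that the layer "considers the context for each timestamp" can be inspected directly. A hedged functional-API sketch follows, assuming the layer's return_attention=True option makes it output the attended sequence together with the attention matrix; shapes and sizes are illustrative.

import numpy as np
import keras
from keras_self_attention import SeqSelfAttention

# Expose the attention weights for inspection (return_attention is assumed
# to make the layer return a (sequence, weights) pair).
inputs = keras.layers.Input(shape=(None,))
embedded = keras.layers.Embedding(input_dim=10000, output_dim=64, mask_zero=True)(inputs)
encoded = keras.layers.Bidirectional(keras.layers.LSTM(units=32, return_sequences=True))(embedded)
attended, weights = SeqSelfAttention(attention_activation='sigmoid',
                                     return_attention=True)(encoded)
model = keras.models.Model(inputs=inputs, outputs=[attended, weights])

# For a batch of token-id sequences, the weights should have shape
# (batch, timesteps, timesteps): row t holds the weights that timestep t
# assigns to every position in the sequence, i.e. its context.
batch = np.random.randint(1, 10000, size=(2, 12))
outputs, attn = model.predict(batch)
print(attn.shape)  # expected: (2, 12, 12)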