pip install keras-self-attention==0.50.0

Attention mechanism for processing sequential data that considers the context for each timestamp
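To make the one-line description concrete: self-attention scores every timestep of a sequence against every other timestep and re-weights the sequence by those scores, so each output position carries context from the whole sequence. Below is a minimal NumPy sketch of plain dot-product self-attention; it is illustrative only and is not the keras-self-attention API (the library ships Keras layers, and its default scoring is additive rather than dot-product):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(seq):
    """Dot-product self-attention over a (timesteps, features) array."""
    scores = seq @ seq.T                 # (T, T) pairwise similarity of timesteps
    weights = softmax(scores, axis=-1)   # each row is a distribution over timesteps
    return weights @ seq, weights        # context-weighted sequence, attention map

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))              # 5 timesteps, 8 features
out, w = self_attention(x)
print(out.shape)                         # (5, 8)
```

The output keeps the input's shape, which is why such a layer can be dropped between recurrent or embedding layers in a Keras model without reshaping.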

Among the top 2% of packages on PyPI.
Over 135.7K downloads in the last 90 days.

Commonly used with keras-self-attention

Based on how often these packages appear together in public requirements.txt files on GitHub.

keras-transformer: Transformer implemented in Keras
keras-position-wise-feed-forward: Feed-forward layer implemented in Keras
keras-embed-sim: Calculate similarity with embeddings
keras-pos-embd: Position embedding layers in Keras
keras-multi-head: A wrapper layer for stacking layers horizontally
gpmap: A Python API for managing genotype-phenotype map data
sklearn-deap2: Use evolutionary algorithms instead of grid search in scikit-learn
pgmpy: A library for probabilistic graphical models
natto-py: A tasty Python binding with MeCab (FFI-based; no SWIG or compiler necessary)
pyCTS: (no description provided)
caspo: Reasoning on the response of logical signaling networks with Answer Set Programming
titus: Python implementation of the Portable Format for Analytics (PFA): producer, converter, and consumer
pydas: Upload data to a Midas Server application with Python
anndata: Annotated data
openfermion: The electronic structure package for quantum computers
joblib: Lightweight pipelining, using Python functions as pipeline jobs
cyordereddict: Cython implementation of Python's collections.OrderedDict
smac: SMAC3, a Python implementation of Sequential Model-based Algorithm Configuration

Version usage of keras-self-attention

Proportion of downloaded versions in the last 3 months (only versions over 1%).

0.50.0: 86.18%
0.46.0: 9.42%