Module fast_transformers.recurrent

Implementations of transformers as recurrent functions.

Source code
#
# Copyright (c) 2020 Idiap Research Institute, http://www.idiap.ch/
# Written by Angelos Katharopoulos <angelos.katharopoulos@idiap.ch>,
# Apoorv Vyas <avyas@idiap.ch>
#

"""Implementations of transformers as recurrent functions."""

Sub-modules

fast_transformers.recurrent.attention

Implementations of different types of autoregressive attention mechanisms for self-attention and cross-attention.

fast_transformers.recurrent.transformers

Implementations of transformer encoders and decoders as RNNs, to be used with the different recurrent attention mechanisms … A usage sketch follows this list.
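As a concrete entry point, here is a minimal sketch of running such a recurrent encoder autoregressively, assuming the RecurrentEncoderBuilder interface from fast_transformers.builders; the hyperparameter values are arbitrary and exact keyword names may differ across versions.

import torch
from fast_transformers.builders import RecurrentEncoderBuilder

# Build a recurrent transformer encoder with linear attention.
# d_model = n_heads * value_dimensions = 4 * 32 = 128 here.
model = RecurrentEncoderBuilder.from_kwargs(
    attention_type="linear",
    n_layers=2,
    n_heads=4,
    query_dimensions=32,
    value_dimensions=32,
    feed_forward_dimensions=256,
).get().eval()

# Autoregressive inference: feed one time step at a time and carry
# the per-layer attention state instead of the growing prefix.
x = torch.rand(1, 128)  # (batch, d_model) for a single step
state = None
with torch.no_grad():
    for _ in range(10):
        x, state = model(x, state=state)

A decoder counterpart that additionally performs cross-attention over an encoder memory is built analogously with the cross-attention mechanisms from fast_transformers.recurrent.attention.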