The Transformer Positional Encoding Layer in Keras, Part 2

In Part 1, A Gentle Introduction to Positional Encoding in Transformer Models, we discussed the positional encoding layer of the transformer model and showed how you could implement it and its functions yourself in Python. In this tutorial, you'll implement the positional encoding layer in Keras and TensorFlow, so that you can then use it in a complete transformer model.
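
As a quick refresher before moving to Keras, the fixed sinusoidal encoding from Part 1 can be computed in plain NumPy. The following is a minimal sketch following the standard formulation; the function name and the example dimensions are chosen here for illustration:

```python
import numpy as np

def positional_encoding(seq_length, d_model, n=10000):
    # Fixed sinusoidal encoding:
    #   P[k, 2i]   = sin(k / n^(2i / d_model))
    #   P[k, 2i+1] = cos(k / n^(2i / d_model))
    P = np.zeros((seq_length, d_model))
    for k in range(seq_length):
        for i in range(d_model // 2):
            denominator = np.power(n, 2 * i / d_model)
            P[k, 2 * i] = np.sin(k / denominator)
            P[k, 2 * i + 1] = np.cos(k / denominator)
    return P

print(positional_encoding(seq_length=4, d_model=4))
```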

After completing this tutorial, you will know:

  • Text vectorization in Keras
  • Embedding layer in Keras
  • How to subclass the embedding layer and write your own positional encoding layer (see the sketch after this list)
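
To preview how these pieces fit together, here is a minimal sketch, assuming TensorFlow 2.x: a TextVectorization layer maps raw strings to integer token IDs, and a small Layer subclass sums a learned word Embedding with a learned position Embedding. The class name PositionEmbeddingLayer and the toy sentences are illustrative choices, not part of the original text:

```python
import tensorflow as tf
from tensorflow.keras.layers import Embedding, Layer, TextVectorization

class PositionEmbeddingLayer(Layer):
    """Adds a learned position embedding to a learned word embedding."""
    def __init__(self, seq_length, vocab_size, output_dim, **kwargs):
        super().__init__(**kwargs)
        self.word_embedding = Embedding(input_dim=vocab_size, output_dim=output_dim)
        self.position_embedding = Embedding(input_dim=seq_length, output_dim=output_dim)

    def call(self, inputs):
        # One position index per token; the result broadcasts over the batch dimension.
        positions = tf.range(start=0, limit=tf.shape(inputs)[-1], delta=1)
        return self.word_embedding(inputs) + self.position_embedding(positions)

# Toy corpus: learn a vocabulary, then convert strings to fixed-length token IDs.
sentences = ["I am a robot", "you too robot"]
vectorize_layer = TextVectorization(output_sequence_length=5, max_tokens=10)
vectorize_layer.adapt(sentences)
vectorized = vectorize_layer(tf.constant(sentences))

embedding_layer = PositionEmbeddingLayer(seq_length=5, vocab_size=10, output_dim=6)
print(embedding_layer(vectorized).shape)  # (2, 5, 6)
```

A learned position embedding is the simplest variant; to reproduce the fixed sinusoidal encoding instead, you could initialize the position Embedding with a precomputed sinusoidal matrix (e.g., via embeddings_initializer) and set trainable=False.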

Kick-start your project with my book Building Transformer Models with Attention. It provides self-study tutorials with working code to guide you through building a fully-working transformer model.
