Sequence To Sequence Learning
With TensorFlow & Keras Tutorial Series
This is the index page of the “SEQ2SEQ Learning in Deep Learning with TensorFlow & Keras” tutorial series.
You can access all the content of the series in English and Turkish as YouTube videos, Medium posts, and Colab/GitHub Jupyter notebooks using the links below.
Last Updated: 27 May 2021
You can access all my SEQ2SEQ Learning videos on the Murat Karakaya Akademi YouTube channel in ENGLISH or in TURKISH.
You can access all the tutorials on the kmkarakaya Medium blog.
All the code is shared on the kmkarakaya GitHub Pages.
Here is the list of tutorials:
Part A: AN INTRODUCTION TO SEQ2SEQ LEARNING AND A SAMPLE SOLUTION WITH AN MLP NETWORK
- YouTube Videos in ENGLISH or TURKISH / Medium Post / Colab Notebook
Part B: SEQ2SEQ LEARNING WITH RECURRENT NEURAL NETWORKS (LSTM)
- YouTube Video in ENGLISH or TURKISH / Medium Post / Colab Notebook
Part C: SEQ2SEQ LEARNING WITH A BASIC ENCODER-DECODER MODEL
- YouTube Video in ENGLISH or TURKISH / Medium Post / Colab Notebook
Part D: SEQ2SEQ LEARNING WITH AN ENCODER-DECODER MODEL WITH TEACHER FORCING
- YouTube Video in ENGLISH or TURKISH / Medium Post / Colab Notebook
Part E: SEQ2SEQ LEARNING WITH AN ENCODER-DECODER MODEL WITH TEACHER FORCING FOR VARIABLE INPUT AND OUTPUT SIZE: MASKING & PADDING
- YouTube Video in ENGLISH or TURKISH / Medium Post / Colab Notebook
Part F: SEQ2SEQ LEARNING WITH AN ENCODER-DECODER MODEL + BAHDANAU & LUONG ATTENTION
- YouTube Video in ENGLISH or TURKISH / Medium Post / Colab Notebook
Part G: To be continued! :)
WHY DO WE HAVE SO MANY PARTS?
- Our aim is to code an Encoder-Decoder Model with Attention (a minimal preview sketch is given at the end of this post).
- However, I would like to develop the solution step by step, showing the shortcomings of other possible approaches along the way.
- Therefore, in the first two parts, we will observe that the initial models have their own weaknesses.
- This way, we also come to understand why the Encoder-Decoder paradigm is so successful.
So, please patiently follow the parts as we develop a better solution :)
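As a quick preview of where the series is headed, below is a minimal encoder-decoder sketch with Luong-style (dot-product) attention, built from standard Keras layers. The vocabulary size, embedding size, and LSTM units are illustrative assumptions, not the exact settings used in the tutorials.

```python
import tensorflow as tf
from tensorflow.keras import layers

vocab_size = 50  # assumed token vocabulary size (illustrative)
embed_dim = 16   # assumed embedding size (illustrative)
units = 32       # assumed LSTM units (illustrative)

# Encoder: embeds the input sequence and returns all hidden states
# plus the final states, which initialize the decoder.
enc_inputs = layers.Input(shape=(None,), name="encoder_inputs")
enc_emb = layers.Embedding(vocab_size, embed_dim, mask_zero=True)(enc_inputs)
enc_outputs, state_h, state_c = layers.LSTM(
    units, return_sequences=True, return_state=True)(enc_emb)

# Decoder: consumes the (teacher-forced) target sequence.
dec_inputs = layers.Input(shape=(None,), name="decoder_inputs")
dec_emb = layers.Embedding(vocab_size, embed_dim, mask_zero=True)(dec_inputs)
dec_outputs = layers.LSTM(units, return_sequences=True)(
    dec_emb, initial_state=[state_h, state_c])

# Luong-style dot-product attention over the encoder states.
context = layers.Attention()([dec_outputs, enc_outputs])
concat = layers.Concatenate()([dec_outputs, context])
predictions = layers.Dense(vocab_size, activation="softmax")(concat)

model = tf.keras.Model([enc_inputs, dec_inputs], predictions)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```

Parts C through F of the series build up each piece of this model (the encoder, the decoder, teacher forcing, masking & padding, and attention) one step at a time.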