Wednesday, November 16, 2022

Sequence To Sequence Learning With TensorFlow & Keras Tutorial Series

The Seq2Seq Learning Tutorial Series aims to build an Encoder-Decoder Model with Attention. Rather than presenting the final solution right away, I develop it step by step, exposing the shortcomings of simpler approaches along the way. In the first two parts, we will see where these initial models fall short, which in turn makes clear why the Encoder-Decoder paradigm is so successful.
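To give a concrete picture of where the series is headed, below is a minimal sketch of a basic Keras encoder-decoder model (without attention, roughly the shape of the model built in Part C). This is illustrative only, not the code developed in the tutorials; vocab_size and latent_dim are hypothetical placeholder values.

from tensorflow import keras
from tensorflow.keras import layers

vocab_size = 10   # hypothetical: size of the token vocabulary
latent_dim = 64   # hypothetical: LSTM hidden state size

# Encoder: reads the input sequence and summarizes it in its final LSTM states.
encoder_inputs = layers.Input(shape=(None,), name="encoder_inputs")
encoder_embedded = layers.Embedding(vocab_size, latent_dim)(encoder_inputs)
_, state_h, state_c = layers.LSTM(latent_dim, return_state=True)(encoder_embedded)

# Decoder: generates the output sequence, conditioned on the encoder states.
decoder_inputs = layers.Input(shape=(None,), name="decoder_inputs")
decoder_embedded = layers.Embedding(vocab_size, latent_dim)(decoder_inputs)
decoder_outputs = layers.LSTM(latent_dim, return_sequences=True)(
    decoder_embedded, initial_state=[state_h, state_c]
)
predictions = layers.Dense(vocab_size, activation="softmax")(decoder_outputs)

model = keras.Model([encoder_inputs, decoder_inputs], predictions)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()

The key design choice, which the series motivates in detail, is that the encoder compresses the whole input into its final states, and the decoder is initialized with those states before generating the output sequence.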

You can access all the parts via the links below.


Photo by Clay Banks on Unsplash

PARTS


Part A: AN INTRODUCTION TO SEQ2SEQ LEARNING AND A SAMPLE SOLUTION WITH MLP NETWORK

YouTube Video in ENGLISH or TURKISH / Post / Colab Notebook

Part B: SEQ2SEQ LEARNING WITH RECURRENT NEURAL NETWORKS (LSTM)

YouTube Video in ENGLISH or TURKISH / Post / Colab Notebook

Part C: SEQ2SEQ LEARNING WITH A BASIC ENCODER DECODER MODEL

YouTube Video in ENGLISH or TURKISH / Post / Colab Notebook

Part D: SEQ2SEQ LEARNING WITH AN ENCODER DECODER MODEL + TEACHER FORCING

YouTube Video in ENGLISH or TURKISH / Post / Colab Notebook

Part E: SEQ2SEQ LEARNING WITH AN ENCODER DECODER MODEL WITH TEACHER FORCING FOR VARIABLE INPUT AND OUTPUT SIZE: MASKING & PADDING

YouTube Video in ENGLISH or TURKISH / Post / Colab Notebook

Part F: SEQ2SEQ LEARNING WITH AN ENCODER DECODER MODEL + BAHDANAU ATTENTION + LUONG ATTENTION

YouTube Video in ENGLISH or TURKISH / Post / Colab Notebook

I hope you enjoy the Seq2Seq Learning Tutorial Series!

You can access Murat Karakaya Akademi via:

YouTube

Facebook

Instagram

LinkedIn

Github

Kaggle

muratkarakaya.net