
Wednesday, November 16, 2022

Sequence To Sequence Learning With Tensorflow & Keras Tutorial Series



The Seq2Seq Learning Tutorial Series aims to build an Encoder-Decoder Model with Attention. Rather than presenting the final solution directly, I would like to develop it by showing the shortcomings of other possible approaches as well. Therefore, in the first two parts, we will observe that the initial models have their own weaknesses, and we will understand why the Encoder-Decoder paradigm is so successful.

You can access all the parts via the links below.


Photo by Clay Banks on Unsplash

Thursday, November 10, 2022

Seq2Seq Learning Part A: Introduction & A Sample Solution with MLP Network

 


If you are interested in Seq2Seq Learning, I have good news for you. Recently, I have been working on Seq2Seq Learning, and I decided to prepare a series of tutorials that takes you from a simple Multi-Layer Perceptron Neural Network model to an Encoder-Decoder Model with Attention.

You can access all my SEQ2SEQ Learning videos on the Murat Karakaya Akademi YouTube channel in ENGLISH or in TURKISH.

You can access all the tutorials in this series from my blog on www.muratkarakaya.net

Thank you!



Photo by Hal Gatewood on Unsplash

Seq2Seq Learning Part B: Using the LSTM layer in a Recurrent Neural Network

 


Welcome to Part B of the Seq2Seq Learning Tutorial Series. In this tutorial, we will use several Recurrent Neural Network models to solve the sample Seq2Seq problem introduced in Part A.

We will use LSTM as the Recurrent Neural Network layer in Keras.
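To give a flavor of Part B before diving in, here is a minimal sketch of using the Keras LSTM layer to map an input sequence to an output sequence of the same length. It is not the tutorial's exact code: the sequence length, vocabulary size, and the toy "reverse the sequence" task are assumptions made only for illustration.

import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

seq_len, vocab_size = 4, 10  # assumed toy dimensions

# One LSTM layer emits a hidden state per time step; a time-distributed
# Dense layer turns each state into a probability distribution over tokens.
model = models.Sequential([
    layers.Input(shape=(seq_len, vocab_size)),
    layers.LSTM(64, return_sequences=True),
    layers.TimeDistributed(layers.Dense(vocab_size, activation="softmax")),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])

# Assumed toy task: predict the input sequence reversed.
x = tf.one_hot(np.random.randint(0, vocab_size, size=(256, seq_len)), vocab_size)
y = tf.reverse(x, axis=[1])
model.fit(x, y, epochs=2, batch_size=32, verbose=0)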

You can access all my SEQ2SEQ Learning videos on the Murat Karakaya Akademi YouTube channel in ENGLISH or in TURKISH.

You can access all the tutorials in this series from my blog at www.muratkarakaya.net

If you would like to follow up on Deep Learning tutorials, please subscribe to my YouTube Channel or follow my blog on muratkarakaya.net. Thank you!


Photo by Jess Bailey on Unsplash

Seq2Seq Learning Part C: Basic Encoder-Decoder Architecture & Design

 


Welcome to Part C of the Seq2Seq Learning Tutorial Series. In this tutorial, we will design a Basic Encoder-Decoder model to solve the sample Seq2Seq problem introduced in Part A.

We will use LSTM as the Recurrent Neural Network layer in Keras.
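As a rough preview of the architecture covered in Part C, below is a minimal sketch of a basic encoder-decoder in Keras; the layer sizes and sequence lengths are assumptions for illustration, not the tutorial's exact code. The encoder compresses the input sequence into a fixed-size context vector, and the decoder unrolls that context into the output sequence.

import tensorflow as tf
from tensorflow.keras import layers, models

in_len, out_len, vocab_size, units = 4, 4, 10, 64  # assumed toy dimensions

inputs = layers.Input(shape=(in_len, vocab_size))
context = layers.LSTM(units)(inputs)                    # encoder: keep only the final hidden state
repeated = layers.RepeatVector(out_len)(context)        # feed the context vector at every decoder step
decoded = layers.LSTM(units, return_sequences=True)(repeated)   # decoder
outputs = layers.TimeDistributed(layers.Dense(vocab_size, activation="softmax"))(decoded)

model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="categorical_crossentropy")
model.summary()

Because the whole input must be squeezed into that single context vector, this basic design struggles as sequences get longer, which is exactly the weakness the later parts address.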

You can access all my SEQ2SEQ Learning videos on the Murat Karakaya Akademi YouTube channel in ENGLISH or in TURKISH.

You can access all the tutorials in this series from my blog at www.muratkarakaya.net

If you would like to follow up on Deep Learning tutorials, please subscribe to my YouTube Channel or follow my blog on muratkarakaya.net. Thank you!


Photo by Med Badr Chemmaoui on Unsplash

Seq2Seq Learning PART D: Encoder-Decoder with Teacher Forcing

 


Welcome to Part D of the Seq2Seq Learning Tutorial Series. In this tutorial, we will design an Encoder-Decoder model to be trained with “Teacher Forcing” to solve the sample Seq2Seq problem introduced in Part A.

We will use the LSTM layer in Keras as the Recurrent Neural Network.
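As a preview of the idea, here is a minimal sketch of an encoder-decoder wired for teacher forcing; the shapes and sizes are assumptions for illustration, not the tutorial's exact code. The key point is that, during training, the decoder receives the ground-truth target sequence shifted one step to the right instead of its own previous predictions.

import tensorflow as tf
from tensorflow.keras import layers, models

in_len, out_len, vocab_size, units = 4, 4, 10, 64  # assumed toy dimensions

# Encoder: keep the final LSTM states to initialize the decoder.
encoder_inputs = layers.Input(shape=(in_len, vocab_size))
_, state_h, state_c = layers.LSTM(units, return_state=True)(encoder_inputs)

# Decoder: fed the shifted ground-truth targets (teacher forcing) while training.
decoder_inputs = layers.Input(shape=(out_len, vocab_size))
decoder_seq = layers.LSTM(units, return_sequences=True)(
    decoder_inputs, initial_state=[state_h, state_c])
decoder_outputs = layers.TimeDistributed(
    layers.Dense(vocab_size, activation="softmax"))(decoder_seq)

model = models.Model([encoder_inputs, decoder_inputs], decoder_outputs)
model.compile(optimizer="adam", loss="categorical_crossentropy")
# A training call would look like:
# model.fit([encoder_data, shifted_targets], targets, ...)  # shifted_targets = targets lagged by one step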

You can access all my SEQ2SEQ Learning videos on the Murat Karakaya Akademi YouTube channel in ENGLISH or in TURKISH.

You can access all the tutorials in this series from my blog at www.muratkarakaya.net.

If you would like to follow up on Deep Learning tutorials, please subscribe to my YouTube Channel or follow my blog on muratkarakaya.net. You can also access this Colab Notebook using the link.

If you are ready, let’s get started!



Photo by Vedrana Filipović on Unsplash

Seq2Seq Learning PART E: Encoder-Decoder for Variable Input And Output Sizes: Padding & Masking

 


Welcome to Part E of the Seq2Seq Learning Tutorial Series. In this tutorial, we will design an Encoder-Decoder model to handle variable-size input and output sequences by using Padding and Masking methods. We will train the model by using the Teacher Forcing technique, which we covered in Part D.
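To illustrate the padding and masking idea before the tutorial, here is a minimal sketch; the example sequences and layer sizes are assumptions, not the tutorial's exact code. Short sequences are padded with 0 to a common length, and the padded time steps are masked so that they do not influence training.

from tensorflow.keras import layers, models
from tensorflow.keras.preprocessing.sequence import pad_sequences

# Assumed toy data: integer-encoded sequences of different lengths.
raw = [[3, 7, 2], [5, 1], [9, 4, 6, 8]]
padded = pad_sequences(raw, maxlen=4, padding="post", value=0)
print(padded)  # 0s fill the shorter sequences up to length 4

# mask_zero=True tells the downstream LSTM to skip the padded (0) time steps.
model = models.Sequential([
    layers.Input(shape=(4,)),
    layers.Embedding(input_dim=10, output_dim=16, mask_zero=True),
    layers.LSTM(32),
    layers.Dense(1),
])
model.summary()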

You can access all my SEQ2SEQ Learning videos on the Murat Karakaya Akademi YouTube channel in ENGLISH or in TURKISH.

You can access all the tutorials in this series from my blog at www.muratkarakaya.net.

You can access this Colab Notebook using the link.

If you would like to follow up on Deep Learning tutorials, please subscribe to my YouTube Channel or follow my blog on muratkarakaya.net.  

If you are ready, let’s get started!



Photo by Jeffrey Brandjes on Unsplash


Seq2Seq Learning PART F: Encoder-Decoder with Bahdanau & Luong Attention Mechanism


Welcome to Part F of the Seq2Seq Learning Tutorial Series. In this tutorial, we will design an Encoder-Decoder model to handle longer input and output sequences by using two global attention mechanisms: Bahdanau & Luong. During the tutorial, we will be using the Encoder-Decoder model developed in Part C.


First, we will observe that the Basic Encoder-Decoder model fails to handle long input sequences. Then, we will discuss how to relate each output to all the inputs by using a global attention mechanism. We will implement the Bahdanau attention mechanism as a custom layer in Keras by using subclassing. Next, we will integrate the attention layer into the Encoder-Decoder model to process the longer data efficiently. After observing the effect of the attention layer on performance, we will visualize the attention between inputs and outputs. Lastly, we will code the Luong attention mechanism.
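As a rough preview of the subclassing approach, here is a minimal sketch of Bahdanau (additive) attention written as a custom Keras layer; the class name, layer sizes, and toy shapes are assumptions for illustration, not the tutorial's exact implementation. Given the decoder's previous hidden state (the query) and all encoder outputs (the values), the layer scores every encoder time step, turns the scores into weights with a softmax, and returns the weighted sum as the context vector.

import tensorflow as tf
from tensorflow.keras import layers

class BahdanauAttention(layers.Layer):
    def __init__(self, units):
        super().__init__()
        self.W1 = layers.Dense(units)  # projects the decoder state (query)
        self.W2 = layers.Dense(units)  # projects the encoder outputs (values)
        self.V = layers.Dense(1)       # scores each encoder time step

    def call(self, query, values):
        # query: (batch, units) -> (batch, 1, units) so it broadcasts over the time axis
        query_with_time_axis = tf.expand_dims(query, 1)
        score = self.V(tf.nn.tanh(self.W1(query_with_time_axis) + self.W2(values)))
        attention_weights = tf.nn.softmax(score, axis=1)                     # (batch, in_len, 1)
        context_vector = tf.reduce_sum(attention_weights * values, axis=1)   # (batch, units)
        return context_vector, attention_weights

# Usage with assumed toy shapes:
encoder_outputs = tf.random.normal((2, 5, 16))  # (batch, input length, encoder units)
decoder_state = tf.random.normal((2, 16))       # (batch, decoder units)
context, weights = BahdanauAttention(10)(decoder_state, encoder_outputs)
print(context.shape, weights.shape)             # (2, 16) and (2, 5, 1)

In the full model, the context vector is typically concatenated with the decoder input at each step, which is what lets every output attend to all of the inputs rather than a single fixed-size summary.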

You can access all my SEQ2SEQ Learning videos on the Murat Karakaya Akademi YouTube channel in ENGLISH or in TURKISH.

You can access all the tutorials in this series from my blog at www.muratkarakaya.net.

You can access the whole code on Colab.

If you would like to follow up on Deep Learning tutorials, please subscribe to my YouTube Channel or follow my blog on muratkarakaya.net. Thank you!

If you are ready, let’s get started!


Photo by Bradyn Trollip on Unsplash

Wednesday, November 9, 2022

Sequence To Sequence Learning With Tensorflow & Keras Tutorial Series


This is the Index page of the “SEQ2SEQ Learning in Deep Learning with TensorFlow & Keras” tutorial series.

You can access all the content of the series in English and Turkish as YouTube videos, Medium posts, and Colab / GitHub Jupyter Notebooks using the links below.

Last Updated: 27 May 2021



Encoder-Decoder Model with Global Attention

Tuesday, November 8, 2022

All About LSTM Tutorial Series

 


This is the index page of the “All About LSTM in Tensorflow & Keras” tutorial series. We will cover all the topics related to LSTM with sample implementations in Python, TensorFlow, and Keras.

You can access the codes, videos, and posts from the links below.

You may also like to check out the SEQ2SEQ (SEQUENCE TO SEQUENCE) Learning tutorial series, in which the LSTM layer is used heavily.

If you would like to learn more about Deep Learning with practical coding examples, please subscribe to the Murat Karakaya Akademi YouTube Channel or follow my blog on muratkarakaya.net.  Do not forget to turn on notifications so that you will be notified when new parts are uploaded.

Last updated: 11/05/2022



Photo by Laura Fuhrman on Unsplash