Showing posts with label TF Data Pipeline.

Tuesday, November 8, 2022

tf.data: Build Efficient TensorFlow Input Pipelines for Image Datasets



This tutorial will focus on how to build efficient TensorFlow input pipelines for image datasets in Deep Learning with TensorFlow & Keras.

First, we will review the tf.data library. Then, we will download a sample image and label files. After gathering all the image file paths in the directories, we will merge the file names with the labels to create the train and test datasets. Using tf.data.Dataset methods, we will learn how to map, prefetch, cache, and batch the datasets correctly so that the data input pipeline is efficient in terms of time and performance. We will discuss how the map, prefetch, cache, and batch functions affect the performance of the tf.data.Dataset input pipeline.
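For illustration, here is a minimal sketch (not the tutorial's exact code) of such a pipeline; the file paths, image size, and batch size below are placeholder assumptions:

import tensorflow as tf

IMG_SIZE = (224, 224)        # assumed target image size
BATCH_SIZE = 32              # assumed batch size

# Hypothetical file paths and labels gathered from the image directories
file_paths = ["images/cat_001.jpg", "images/dog_001.jpg"]
labels = [0, 1]

def load_and_preprocess(path, label):
    image = tf.io.read_file(path)                      # read the raw bytes
    image = tf.io.decode_jpeg(image, channels=3)       # decode to an RGB tensor
    image = tf.image.resize(image, IMG_SIZE)
    image = tf.cast(image, tf.float32) / 255.0         # scale pixels to [0, 1]
    return image, label

train_ds = (tf.data.Dataset.from_tensor_slices((file_paths, labels))
            .map(load_and_preprocess, num_parallel_calls=tf.data.AUTOTUNE)
            .cache()                                   # keep decoded images after the first epoch
            .shuffle(buffer_size=1000)
            .batch(BATCH_SIZE)
            .prefetch(tf.data.AUTOTUNE))               # overlap preprocessing with training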

Moreover, we will see how to use the TensorBoard add-on “TF Profiler” for monitoring the performance and bottlenecks of the tf.data input pipeline.
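As a rough sketch of what enabling the profiler can look like in practice (the log directory and batch range below are illustrative assumptions, not the tutorial's settings):

import tensorflow as tf

# Profile training batches 10 through 20 and write the trace next to the TensorBoard logs
tensorboard_cb = tf.keras.callbacks.TensorBoard(
    log_dir="logs/profile",          # hypothetical log directory
    profile_batch=(10, 20))

# model.fit(train_ds, epochs=5, callbacks=[tensorboard_cb])
# Then inspect the "Profile" tab after launching:  tensorboard --logdir logs/profile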

You can access this Colab Notebook using the link. You can access all the tf.data: Tensorflow Data Pipeline tutorials at muratkarakaya.net.

If you would like to learn more about Deep Learning with practical coding examples, please subscribe to my YouTube Channel or follow my blog on muratkarakaya.net. Do not forget to turn on notifications so that you will be notified when new parts are uploaded.

If you are ready, let’s get started!



Building an Efficient TensorFlow Input Pipeline for Character-Level Text Generation


This tutorial is the second part of the “Text Generation in Deep Learning with Tensorflow & Keras” series. 

In this tutorial series, we have been covering all the topics related to Text Generation with sample implementations in Python. In this tutorial, we will focus on how to build an efficient TensorFlow input pipeline for character-level text generation.

First, we will download a sample corpus (text file). After opening the file and reading it line-by-line, we will convert it to a single line of text. Then, we will split the text into input character sequence (X) and output character (y). 

Using tf.data.Dataset and Keras TextVectorization methods, we will

  • preprocess the text,
  • convert the characters into integer representation,
  • prepare the training dataset,
  • and optimize the data pipeline.

Thus, in the end, we will be ready to train a Language Model for character-level text generation.
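As a rough preview, the sketch below shows one possible way to build such (X, y) pairs with tf.data; it uses a tiny illustrative string and a StringLookup table instead of the tutorial's full TextVectorization setup, so treat it as an assumption-laden simplification:

import tensorflow as tf

text = "a sample corpus collapsed into a single line of text"   # illustrative corpus
seq_length = 10                                                  # assumed input length

# Map each character to an integer id
vocab = sorted(set(text))
char_to_id = tf.keras.layers.StringLookup(vocabulary=vocab)
ids = char_to_id(tf.strings.unicode_split(text, "UTF-8"))

def split_input_target(chunk):
    return chunk[:-1], chunk[-1]        # X: first seq_length chars, y: the next char

dataset = (tf.data.Dataset.from_tensor_slices(ids)
           .batch(seq_length + 1, drop_remainder=True)   # slice ids into fixed-size chunks
           .map(split_input_target, num_parallel_calls=tf.data.AUTOTUNE)
           .shuffle(1000)
           .batch(64, drop_remainder=True)
           .prefetch(tf.data.AUTOTUNE))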

If you would like to learn more about Deep Learning with practical coding examples, please subscribe to the Murat Karakaya Akademi YouTube Channel or follow my blog on muratkarakaya.net. Do not forget to turn on notifications so that you will be notified when new parts are uploaded.



Photo by Harry Grout on Unsplash

Building an Efficient TensorFlow Input Pipeline for Word-Level Text Generation



This tutorial is the third part of the “Text Generation in Deep Learning with Tensorflow & Keras” series.

In this series, we have been covering all the topics related to Text Generation with sample implementations in Python. This tutorial will focus on how to build an efficient TensorFlow input pipeline for word-level text generation. First, we will download a sample corpus (text file). After opening the file and reading it line-by-line, we will split the text into words. Then, we will generate pairs consisting of an input word sequence (X) and an output word (y).

Using tf.data API and Keras TextVectorization methods, we will

  • preprocess the text,
  • convert the words into integer representation,
  • prepare the training dataset from the pairs,
  • and optimize the data pipeline.

Thus, in the end, we will be ready to train a Language Model for word-level text generation.
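As a preview, here is a minimal, assumption-laden sketch of the idea; the toy corpus, sequence length, and batch size are illustrative and not the tutorial's actual values:

import tensorflow as tf

corpus = ["a small corpus used only to illustrate the word level pipeline"]  # illustrative
seq_length = 4                                                               # assumed

vectorizer = tf.keras.layers.TextVectorization(split="whitespace")
vectorizer.adapt(corpus)                         # build the word vocabulary
token_ids = vectorizer(tf.constant(corpus))[0]   # integer ids of the corpus words

# Sliding windows: seq_length input words followed by the target word
dataset = tf.data.Dataset.from_tensor_slices(token_ids)
dataset = dataset.window(seq_length + 1, shift=1, drop_remainder=True)
dataset = dataset.flat_map(lambda window: window.batch(seq_length + 1))
dataset = (dataset
           .map(lambda chunk: (chunk[:-1], chunk[-1]))   # X: word sequence, y: next word
           .shuffle(1000)
           .batch(64)
           .prefetch(tf.data.AUTOTUNE))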

If you would like to learn more about Deep Learning with practical coding examples, please subscribe to the Murat Karakaya Akademi YouTube Channel or follow my blog on muratkarakaya.net. Do not forget to turn on notifications so that you will be notified when new parts are uploaded.

If you are ready, let’s get started!



Photo by Quinten de Graaf on Unsplash

Character Level Text Generation with an LSTM Model

 


This tutorial is the fifth part of the “Text Generation in Deep Learning with Tensorflow & Keras” series. In this series, we have been covering all the topics related to Text Generation with sample implementations in Python, TensorFlow & Keras.

In this tutorial, we will focus on how to build a Language Model using a Keras LSTM layer for character-level text generation. First, we will download a sample corpus (text file). After opening the file, we will apply the TensorFlow input pipeline that we have developed in Part B to prepare the training dataset by preprocessing and splitting the text into the input character sequence (X) and output character (y). Then, we will design an LSTM-based Language Model and train it using the training set. Later on, we will apply several sampling methods that we have implemented in Part D to generate text and observe the effect of these sampling methods on the generated text. Thus, in the end, we will have a trained LSTM-based Language Model for character-level text generation with three sampling methods.
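For orientation, the snippet below is a minimal sketch of what such an LSTM-based character-level Language Model can look like; the vocabulary size, sequence length, and layer sizes are illustrative assumptions rather than the tutorial's exact configuration:

import tensorflow as tf

vocab_size = 96        # assumed number of distinct characters
seq_length = 100       # assumed input sequence length

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(seq_length,)),
    tf.keras.layers.Embedding(vocab_size, 64),       # map character ids to vectors
    tf.keras.layers.LSTM(256),                       # summarize the input sequence
    tf.keras.layers.Dense(vocab_size)                # logits over the next character
])

model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))

# model.fit(train_ds, epochs=20)   # train_ds yields (input sequence, next character) pairs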

You can access all the parts of the Text Generation in Deep Learning with Tensorflow & Keras tutorial series on my blog at muratkarakaya.net. You can watch all these parts on the Murat Karakaya Akademi channel on YouTube in ENGLISH or TURKISH. You can access this Colab Notebook using the link.

If you would like to learn more about Deep Learning with practical coding examples, please subscribe to the Murat Karakaya Akademi YouTube Channel or follow my blog at muratkarakaya.net. Do not forget to turn on notifications so that you will be notified when new parts are uploaded.

If you are ready, let’s get started!



Photo by Jan Huber on Unsplash

tf.data: Tensorflow Data Pipelines Tutorial Series

 


INDEX PAGE: This is the index page of the “tf.data: Tensorflow Data Pipelines” series.

We will cover all the topics related to the tf.data TensorFlow Data Pipeline with sample implementations in Python, TensorFlow, and Keras.

You can access the codes, videos, and posts from the links below.

If you would like to learn more about Deep Learning with practical coding examples, please subscribe to my YouTube Channel or follow my blog on Blogger. Do not forget to turn on notifications so that you will be notified when new parts are uploaded.



Photo by Florian Wächter on Unsplash

Friday, November 4, 2022

How to save and load a TensorFlow / Keras Model with Custom Objects?

 


Author: Murat Karakaya
Date created: 30 May 2021
Last modified: 30 July 2021
Description: This tutorial will design and train a Keras model (a miniature GPT3) with some custom objects (custom layers). We aim to learn how to save the trained model as a whole (as a TensorFlow SavedModel) and load the saved model back.
Accessible on:
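As a quick preview of the idea (using a hypothetical ScaleLayer rather than the tutorial's miniature GPT3), saving and reloading a model with a custom layer can look roughly like this:

import tensorflow as tf

class ScaleLayer(tf.keras.layers.Layer):       # hypothetical custom layer
    def __init__(self, factor=2.0, **kwargs):
        super().__init__(**kwargs)
        self.factor = factor

    def call(self, inputs):
        return inputs * self.factor

    def get_config(self):                      # lets Keras re-create the layer on load
        config = super().get_config()
        config.update({"factor": self.factor})
        return config

model = tf.keras.Sequential([tf.keras.layers.Input(shape=(4,)), ScaleLayer()])

model.save("saved_model_dir")                  # save the whole model (TensorFlow SavedModel)

loaded = tf.keras.models.load_model(
    "saved_model_dir",
    custom_objects={"ScaleLayer": ScaleLayer}) # tell Keras how to resolve the custom class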



Photo by Markus Winkler on Unsplash

Keras Text Vectorization Layer: Configure, Adapt, Use, Save, Load, and Deploy

 


Author: Murat Karakaya
Date created: 05 Oct 2021
Last modified: 18 March 2023
Description: This is a tutorial about how to build, adapt, use, save, load, and deploy the Keras TextVectorization layer. You can access this tutorial on YouTube in English and Turkish: “TensorFlow Keras Text Vectorization Katmanı” / “TensorFlow Keras Text Vectorization Layer”.

In this tutorial, we will download a Kaggle Dataset in which there are 32 topics and more than 400K total reviews. We will use this dataset for a multi-class text classification task.

Our main aim is to learn how to effectively use the Keras TextVectorization layer in Text Processing and Text Classification.
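As a small taste of what that looks like (the toy reviews, vocabulary limit, and sequence length below are illustrative assumptions, not the tutorial's exact code):

import tensorflow as tf

reviews = ["the delivery was fast", "the product arrived broken", "great value"]  # toy data

vectorize_layer = tf.keras.layers.TextVectorization(
    max_tokens=20000,                # assumed vocabulary limit
    output_mode="int",
    output_sequence_length=40)       # pad / truncate every review to 40 tokens

vectorize_layer.adapt(reviews)       # build the vocabulary from the corpus
print(vectorize_layer(tf.constant(["the product was great"])))   # integer token ids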

The tutorial has five parts and a summary:

  • PART A: BACKGROUND
  • PART B: KNOW THE DATA
  • PART C: USE KERAS TEXT VECTORIZATION LAYER
  • PART D: BUILD AN END-TO-END MODEL
  • PART E: DEPLOY END-TO-END MODEL TO HUGGINGFACE SPACES USING GRADIO
  • SUMMARY

At the end of this tutorial, we will cover:

  • What a Keras TextVectorization layer is
  • Why we need to use a Keras TextVectorization layer in Natural Language Processing (NLP) tasks
  • How to employ a Keras TextVectorization layer in Text Preprocessing
  • How to integrate a Keras TextVectorization layer to a trained model
  • How to save and load a Keras TextVectorization layer and a model with a Keras TextVectorization layer
  • How to integrate a Keras TextVectorization layer with TensorFlow Data Pipeline API (tf.data)
  • How to design, train, save, and load an End-to-End model using Keras TextVectorization layer
  • How to deploy the End-to-End model with a Keras TextVectorization layer implemented with a custom standardization (custom_standardization) function using the Gradio library and HuggingFace Spaces (see the sketch after this list)
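To make the end-to-end idea above concrete, here is a minimal sketch of wrapping an adapted TextVectorization layer inside a model that accepts raw review strings; the layer settings, model layers, and example texts are illustrative assumptions, not the tutorial's exact code:

import tensorflow as tf

# Adapted on a toy corpus; in the tutorial the layer is adapted on the Kaggle reviews
vectorize_layer = tf.keras.layers.TextVectorization(max_tokens=20000,
                                                    output_sequence_length=40)
vectorize_layer.adapt(["the delivery was fast", "great value"])

inputs = tf.keras.Input(shape=(1,), dtype=tf.string)            # raw review text goes in
x = vectorize_layer(inputs)
x = tf.keras.layers.Embedding(20000, 64)(x)
x = tf.keras.layers.GlobalAveragePooling1D()(x)
outputs = tf.keras.layers.Dense(32, activation="softmax")(x)    # 32 topics

end_to_end_model = tf.keras.Model(inputs, outputs)
end_to_end_model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# end_to_end_model.predict(tf.constant([["the product arrived broken"]]))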

Accessible on:





Photo by Francois Olwage on Unsplash

Tuesday, November 1, 2022

Multi-Topic (Multi-Class) Text Classification With Various Deep Learning Models Tutorial Series


Index Page

This is the index page of the “Multi-Topic (Multi-Class) Text Classification With Various Deep Learning Models” tutorial series.

Author: Murat Karakaya
Date created: 17 Sept 2021
Date published: 11 March 2022
Last modified: 09 April 2023

Description: This is a tutorial series that covers all the phases of text classification: Exploratory Data Analysis (EDA) of text, text preprocessing, and multi-class (multi-topic) text classification using the TF Data Pipeline and the Keras TextVectorization preprocessing layer.

We will design various Deep Learning models by using the Keras Embedding layer, Convolutional (Conv1D) layer, Recurrent (LSTM) layer, Transformer Encoder block, and pre-trained transformer (BERT).

We will use a Kaggle Dataset with 32 topics and more than 400K reviews.

We will cover all the topics related to solving Multi-Class Text Classification problems with sample implementations in Python, TensorFlow, and Keras.
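As a preview, one of the simpler model variants described above (Embedding + Conv1D) could be sketched roughly as follows; the vocabulary size, sequence length, and layer sizes are illustrative assumptions:

import tensorflow as tf

vocab_size = 20000     # assumed vocabulary size from the TextVectorization layer
seq_length = 40        # assumed padded review length
num_topics = 32        # number of topics in the Kaggle dataset

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(seq_length,)),
    tf.keras.layers.Embedding(vocab_size, 128),
    tf.keras.layers.Conv1D(64, 5, activation="relu"),        # learn local n-gram features
    tf.keras.layers.GlobalMaxPooling1D(),
    tf.keras.layers.Dense(num_topics, activation="softmax")
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])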

You can access the codes, videos, and posts from the links below.

If you would like to learn more about Deep Learning with practical coding examples, please subscribe to the Murat Karakaya Akademi YouTube Channel or follow my blog on muratkarakaya.net. Remember to turn on notifications so that you will be notified when new parts are uploaded.



Photo by Patrick Tomasso on Unsplash

PARTS

In this tutorial series, there are several parts that cover “Text Classification with Various Deep Learning Models” in detail, as follows.

You can access all these parts on YouTube in ENGLISH or TURKISH!

You can access the complete codes as Colab Notebooks using the links given in each video description (Eng/TR), or you can visit the Murat Karakaya Akademi GitHub Repo.

Comments or Questions?

Please share your Comments or Questions.

Thank you in advance.

Do not forget to check out the following parts!

Take care!

You can access Murat Karakaya Akademi via:

YouTube

Facebook

Instagram

LinkedIn

GitHub

Kaggle

muratkarakaya.net