Showing posts with label Text Classification. Show all posts

Saturday, November 19, 2022

Part F: Text Classification with a Convolutional (Conv1D) Layer in a Feed-Forward Network

 



Author: Murat Karakaya
Date created: 17 Sept 2021
Date published: 11 March 2022
Last modified: 29 Dec 2022

Description: This is Part F of the tutorial series “Multi-Topic Text Classification with Various Deep Learning Models”, which covers all the phases of multi-class text classification:

  • Exploratory Data Analysis (EDA) of text,
  • text preprocessing, and
  • multi-class (multi-topic) text classification using the TF Data Pipeline and the Keras TextVectorization preprocessing layer.

We will design various Deep Learning models by using

  • Keras Embedding layer,
  • Convolutional (Conv1D) layer,
  • Recurrent (LSTM) layer,
  • Transformer Encoder block, and
  • pre-trained transformer (BERT).

We will cover all the topics related to solving Multi-Class Text Classification problems with sample implementations in Python / TensorFlow / Keras environment.

We will use a Kaggle Dataset in which there are 32 topics and more than 400K total reviews.
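The Conv1D classifier this part describes can be sketched roughly as follows. The vocabulary size, sequence length, and layer widths below are illustrative assumptions, not the tutorial's actual values; only the 32 output classes come from the dataset description above.

```python
import tensorflow as tf

# Hypothetical sizes; the tutorial's actual values may differ.
VOCAB_SIZE, SEQ_LEN, EMBED_DIM, NUM_CLASSES = 20000, 100, 64, 32

model = tf.keras.Sequential([
    tf.keras.Input(shape=(SEQ_LEN,)),                          # integer token ids
    tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM),          # token -> dense vector
    tf.keras.layers.Conv1D(128, 5, activation="relu"),         # n-gram features over the sequence
    tf.keras.layers.GlobalMaxPooling1D(),                      # one feature vector per review
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),  # one probability per topic
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",  # integer topic labels
              metrics=["accuracy"])
```

The Conv1D + global max-pooling pair acts like an n-gram detector: each filter fires on a 5-token window, and pooling keeps the strongest activation per filter regardless of where it occurs in the review.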

If you would like to learn more about Deep Learning with practical coding examples, please subscribe to the Murat Karakaya Akademi YouTube Channel or follow my blog on muratkarakaya.net.

You can access all the codes, videos, and posts of this tutorial series from the links below.

PARTS

In this tutorial series, there are several parts that cover Text Classification with various Deep Learning Models. You can access all the parts from this index page.



Photo by Josh Eckstein on Unsplash

Tuesday, November 8, 2022

How to solve Classification Problems in Deep Learning with TensorFlow & Keras?

 

Today, we will focus on how to solve Classification Problems in Deep Learning with TensorFlow & Keras.

When we design a model in Deep Neural Networks, we need to know how to select the proper label encoding, Activation, and Loss functions, along with accuracy metrics, according to the classification task at hand.

Thus, in this tutorial, we will first investigate the types of Classification Problems. Then, we will see the most frequently used label encodings in Keras. We will learn how to select Activation & Loss functions according to the given classification type and label encoding. Moreover, we will examine the details of accuracy metrics in TensorFlow / Keras.

At the end of the tutorial, I hope that we will have a good understanding of these concepts and their implementation in Keras.

Contents:

  • types of Classification Problems,
  • possible label encodings,
  • Activation & Loss functions,
  • accuracy metrics

Furthermore, we will also discuss how the target encoding can affect the selection of Activation & Loss functions.
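As one concrete illustration of how the target encoding pairs with the loss function, the toy sketch below (hypothetical data, not from the tutorial) shows that integer labels belong with SparseCategoricalCrossentropy while one-hot labels belong with CategoricalCrossentropy, and that the two settings compute the same loss on equivalent targets:

```python
import numpy as np
import tensorflow as tf

# Toy 3-class problem: the same two samples, labeled two ways.
int_labels = np.array([0, 2])                    # integer encoding
onehot_labels = tf.one_hot(int_labels, depth=3)  # one-hot encoding

# Softmax-style outputs (each row sums to 1), as a classifier's
# last layer would produce.
probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.3, 0.6]], dtype=np.float32)

# Integer labels pair with SparseCategoricalCrossentropy ...
sparse_loss = tf.keras.losses.SparseCategoricalCrossentropy()(int_labels, probs)
# ... while one-hot labels pair with CategoricalCrossentropy.
cat_loss = tf.keras.losses.CategoricalCrossentropy()(onehot_labels, probs)

# Both encodings describe the same targets, so the losses match.
print(float(sparse_loss), float(cat_loss))
```

Mixing the pairs the other way (e.g. one-hot targets with the sparse loss) is exactly the kind of mismatch that produces the surprising results discussed in the tutorial.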

If you would like to learn more about Deep Learning with practical coding examples, please subscribe to my YouTube Channel or follow my blog on Blogger. Do not forget to turn on Notifications so that you will be notified when new parts are uploaded.

You can access this Colab Notebook using the link given in the video description below.

If you are ready, let’s get started!



Photo by Deon Black on Unsplash



How to solve Multi-Label Classification Problems in Deep Learning with TensorFlow & Keras?

This is the fourth part of the “How to solve Classification Problems in Keras?” series. Before starting this tutorial, I strongly suggest you go over Part A: Classification with Keras to learn all related concepts. 

In this tutorial, we will focus on how to solve Multi-Label Classification Problems in Deep Learning with TensorFlow & Keras. First, we will download a sample multi-label dataset. In multi-label classification problems, we mostly encode the true labels with multi-hot vectors. We will experiment with combinations of various last-layer activation functions and loss functions of a Keras CNN model, and we will observe the effects on the model’s performance.

During the experiments, we will discuss the relationship between Activation & Loss functions, label encodings, and accuracy metrics in detail. We will understand why we can sometimes get surprising results when using parameter settings other than the generally recommended ones. As a result, we will gain insight into activation and loss functions and their interactions. In the end, we will summarize the experiment results in a cheat table.
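A minimal sketch of the generally recommended multi-label setting described above: multi-hot targets, a sigmoid last layer, and binary cross-entropy. The layer sizes and data here are illustrative assumptions, not the tutorial's actual model.

```python
import numpy as np
import tensorflow as tf

# In multi-label classification each sample can belong to several classes,
# so targets are multi-hot vectors rather than one-hot.
y_true = np.array([[1., 0., 1.],   # sample 1 carries labels 0 and 2
                   [0., 1., 0.]],  # sample 2 carries label 1 only
                  dtype=np.float32)

# A tiny model whose last layer uses sigmoid: each unit scores one label
# independently in [0, 1], unlike softmax which forces a single winner.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(3, activation="sigmoid"),
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",  # one binary decision per label
              metrics=[tf.keras.metrics.BinaryAccuracy()])

x = np.random.rand(2, 4).astype(np.float32)
preds = model(x).numpy()
# Each row holds independent per-label probabilities; rows need not sum to 1.
print(preds.shape)
```

Note the metric choice too: BinaryAccuracy thresholds each label independently, which matches the multi-hot encoding, whereas a categorical accuracy metric would silently assume single-label targets.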

You can access the code at Colab and all the posts of the classification tutorial series at muratkarakaya.net. You can watch all these parts on YouTube in ENGLISH or TURKISH as well.

If you would like to follow up on Deep Learning tutorials, please subscribe to my YouTube Channel or follow my blog on muratkarakaya.net. Do not forget to turn on Notifications so that you will be notified when new parts are uploaded.

If you are ready, let’s get started!



Photo by Luca Martini on Unsplash


Friday, November 4, 2022

Bookmarks to the selected Deep Learning / Machine Learning Resources on the Web

 

Author: Murat Karakaya
Date created: 19 May 2020
Last modified: 15 Dec 2021
Description: In this post, I share my bookmarks classified according to specific topics in Deep Learning / Machine Learning, so you can save time searching for similar information on the web. If you have any comments or updates, please feel free to share them with me!

If you are interested in Deep Learning / Machine learning, you can find hundreds of video tutorials with Python code samples in Jupyter notebooks at the following links:



Photo by Bernd Klutsch on Unsplash

Multi-Class Text Classification with a GPT3 Transformer block: An End-to-End Example

 

Author: Murat Karakaya & Cansen Çağlayan
Date created: 05 Oct 2021
Last modified: 19 Oct 2021
Description: This tutorial has 2 parts: Part A: Data Analysis & Text Preprocessing, and Part B: Text Classification.



Photo by Håkon Grimstad on Unsplash

Keras Text Vectorization Layer: Configure, Adapt, Use, Save, Load, and Deploy

 

Author: Murat Karakaya
Date created: 05 Oct 2021
Last modified: 18 March 2023
Description: This is a tutorial about how to build, adapt, use, save, load, and deploy the Keras TextVectorization layer. You can access this tutorial on YouTube in English (“TensorFlow Keras Text Vectorization Layer”) and Turkish (“TensorFlow Keras Text Vectorization Katmanı”).

In this tutorial, we will download a Kaggle Dataset in which there are 32 topics and more than 400K total reviews. We will use this dataset for a multi-class text classification task.

Our main aim is to learn how to effectively use the Keras TextVectorization layer in Text Processing and Text Classification.

The tutorial has 5 parts:

  • PART A: BACKGROUND
  • PART B: KNOW THE DATA
  • PART C: USE KERAS TEXT VECTORIZATION LAYER
  • PART D: BUILD AN END-TO-END MODEL
  • PART E: DEPLOY END-TO-END MODEL TO HUGGINGFACE SPACES USING GRADIO
  • SUMMARY

At the end of this tutorial, we will cover:

  • What a Keras TextVectorization layer is
  • Why we need to use a Keras TextVectorization layer in Natural Language Processing (NLP) tasks
  • How to employ a Keras TextVectorization layer in Text Preprocessing
  • How to integrate a Keras TextVectorization layer into a trained model
  • How to save and load a Keras TextVectorization layer and a model with a Keras TextVectorization layer
  • How to integrate a Keras TextVectorization layer with TensorFlow Data Pipeline API (tf.data)
  • How to design, train, save, and load an End-to-End model using a Keras TextVectorization layer
  • How to deploy the End-to-End model with a Keras TextVectorization layer implemented with a custom standardize (custom_standardization) function using the Gradio library and HuggingFace Spaces
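To give a taste of the configure/adapt/use workflow covered above, here is a minimal sketch of the TextVectorization layer on a made-up three-sentence corpus; the parameter values and example texts are illustrative assumptions, not the tutorial's actual settings.

```python
import tensorflow as tf

# A tiny corpus standing in for the Kaggle reviews.
corpus = ["the product arrived late",
          "great product, fast shipping",
          "late shipping, poor support"]

# Configure: cap the vocabulary and pad/truncate every text to 6 tokens.
vectorizer = tf.keras.layers.TextVectorization(
    max_tokens=20, output_mode="int", output_sequence_length=6)

# Adapt: build the vocabulary from the corpus.
vectorizer.adapt(corpus)

# Use: map raw strings to padded integer sequences.
ids = vectorizer(["fast shipping of the product"])
print(ids.numpy())  # shape (1, 6); unseen words map to the OOV id 1
print(vectorizer.get_vocabulary()[:5])
```

Because the layer is a regular Keras layer, the same adapted instance can be placed in front of a trained model to build the End-to-End model that accepts raw strings, which is what the save/load and deployment parts build on.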





Photo by Francois Olwage on Unsplash

Tuesday, November 1, 2022

Multi-Topic (Multi-Class) Text Classification With Various Deep Learning Models Tutorial Series

Index Page

This is the index page of the “Multi-Topic (Multi-Class) Text Classification With Various Deep Learning Models” tutorial series.

Author: Murat Karakaya
Date created: 17 Sept 2021
Date published: 11 March 2022
Last modified: 09 April 2023

Description: This is a tutorial series that covers all the phases of text classification: Exploratory Data Analysis (EDA) of text, text preprocessing, and multi-class (multi-topic) text classification using the TF Data Pipeline and the Keras TextVectorization preprocessing layer.

We will design various Deep Learning models by using the Keras Embedding layer, Convolutional (Conv1D) layer, Recurrent (LSTM) layer, Transformer Encoder block, and pre-trained transformer (BERT).

We will use a Kaggle Dataset with 32 topics and more than 400K reviews.

We will cover all the topics related to solving Multi-Class Text Classification problems with sample implementations in Python / TensorFlow / Keras.

You can access the codes, videos, and posts from the links below.

If you would like to learn more about Deep Learning with practical coding examples, please subscribe to the Murat Karakaya Akademi YouTube Channel or follow my blog on muratkarakaya.net. Remember to turn on notifications so that you will be notified when new parts are uploaded.



Photo by Patrick Tomasso on Unsplash

PARTS

In this tutorial series, there are several parts that cover “Text Classification with various Deep Learning Models” in detail, as follows.

You can access all these parts on YouTube in ENGLISH or TURKISH!

You can access the complete codes as Colab Notebooks using the links given in each video description (Eng/TR) or you can visit the Murat Karakaya Akademi Github Repo.

Comments or Questions?

Please share your Comments or Questions.

Thank you in advance.

Do not forget to check out the following parts!

Take care!

You can access Murat Karakaya Akademi via:

YouTube

Facebook

Instagram

LinkedIn

GitHub

Kaggle

muratkarakaya.net