
Saturday, November 19, 2022

Part F: Text Classification with a Convolutional (Conv1D) Layer in a Feed-Forward Network

 




Author: Murat Karakaya
Date created: 17 09 2021
Date published: 11 03 2022
Last modified: 29 12 2022

Description: This is Part F of the tutorial series “Multi-Topic Text Classification with Various Deep Learning Models”, which covers all the phases of multi-class text classification:

  • Exploratory Data Analysis (EDA),

We will design various Deep Learning models by using

  • Keras Embedding layer,

We will cover all the topics related to solving Multi-Class Text Classification problems with sample implementations in Python / TensorFlow / Keras environment.

We will use a Kaggle Dataset in which there are 32 topics and more than 400K total reviews.
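To give a flavor of the model this part builds, here is a minimal sketch of a feed-forward text classifier with a Conv1D layer in Keras. The vocabulary size, sequence length, and layer sizes below are illustrative assumptions, not the tutorial's exact hyperparameters; only the 32-class output matches the dataset described above.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

NUM_TOPICS = 32      # the Kaggle dataset has 32 topics
VOCAB_SIZE = 20000   # assumed vocabulary size
MAX_LEN = 100        # assumed (padded) sequence length

# Embedding -> Conv1D -> global max pooling -> softmax over 32 topics
model = tf.keras.Sequential([
    layers.Embedding(VOCAB_SIZE, 64),
    layers.Conv1D(128, 5, activation="relu"),
    layers.GlobalMaxPooling1D(),
    layers.Dense(NUM_TOPICS, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# One dummy batch of token ids just to confirm the output shape.
dummy_batch = np.random.randint(0, VOCAB_SIZE, size=(4, MAX_LEN))
probs = model.predict(dummy_batch, verbose=0)
print(probs.shape)  # (4, 32)
```

The Conv1D layer slides a window of 5 tokens over the embedded sequence, so it picks up short local phrases; global max pooling then keeps the strongest phrase signal per filter before classification.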

If you would like to learn more about Deep Learning with practical coding examples, please subscribe to my YouTube Channel or follow my blog on muratkarakaya.net.

You can access all the codes, videos, and posts of this tutorial series from the links below.



PARTS

In this tutorial series, there are several parts covering Text Classification with various Deep Learning Models. You can access all the parts from this index page.



Photo by Josh Eckstein on Unsplash

Tuesday, November 8, 2022

How to solve Multi-Class Classification Problems in Deep Learning with TensorFlow & Keras?

 


This is the third part of the “How to solve Classification Problems in Keras?” series. If you have not gone over Part A and Part B, please review them before continuing with this tutorial. The link to all parts is provided below.

In this tutorial, we will focus on how to solve Multi-Class Classification Problems in Deep Learning with TensorFlow & Keras. First, we will download the MNIST dataset. In multi-class classification problems, we have two options to encode the true labels by using either:

  • integer numbers, or
  • one-hot vector
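Concretely, for a 10-class problem like MNIST, the same labels can be written either way. A minimal sketch in plain NumPy (the sample label values are illustrative; `np.eye` mimics what `tf.keras.utils.to_categorical` produces):

```python
import numpy as np

# Integer encoding: one class id per sample (MNIST classes are 0..9).
integer_labels = np.array([0, 3, 1, 9])

# One-hot encoding: a length-10 vector with a single 1 per sample,
# equivalent to tf.keras.utils.to_categorical(integer_labels, 10).
one_hot_labels = np.eye(10)[integer_labels]

print(one_hot_labels[1])  # [0. 0. 0. 1. 0. 0. 0. 0. 0. 0.]
```

With integer labels you pair the model with `sparse_categorical_crossentropy`; with one-hot labels you pair it with `categorical_crossentropy`.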

We will experiment with both encodings to observe the effect of the combinations of various last-layer activation functions and loss functions on a Keras CNN model’s performance. In both experiments, we will discuss the relationship between Activation & Loss functions, label encodings, and accuracy metrics in detail.

We will understand why sometimes we could get surprising results when using different parameter settings other than the generally recommended ones. As a result, we will gain insight into activation and loss functions and their interactions. In the end, we will summarize the experiment results in a cheat table.
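One relationship the experiments rest on can be shown up front: `sparse_categorical_crossentropy` takes integer labels and `categorical_crossentropy` takes one-hot labels, yet on the same softmax outputs they compute the same value. A minimal sketch in plain NumPy with illustrative probabilities (not values from the tutorial's experiments):

```python
import numpy as np

# Softmax outputs for 2 samples over 3 classes (illustrative values).
probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1]])
int_labels = np.array([0, 1])        # integer encoding
one_hot = np.eye(3)[int_labels]      # one-hot encoding of the same labels

# Sparse variant: pick the predicted probability of each true class id.
sparse_ce = -np.log(probs[np.arange(2), int_labels]).mean()

# One-hot variant: dot each one-hot row with the log-probabilities.
categorical_ce = -(one_hot * np.log(probs)).sum(axis=1).mean()

print(np.isclose(sparse_ce, categorical_ce))  # True
```

Mismatching the pairing (e.g. feeding integer labels to `categorical_crossentropy`) is exactly the kind of parameter setting that produces the surprising results examined below.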

You can access the code at Colab and all the posts of the classification tutorial series at muratkarakaya.net. You can watch all these parts on YouTube in ENGLISH or TURKISH as well.

If you would like to follow up on Deep Learning tutorials, please subscribe to my YouTube Channel or follow my blog on muratkarakaya.net. Do not forget to turn on Notifications so that you will be notified when new parts are uploaded.

If you are ready, let’s get started!



Photo by Debby Hudson on Unsplash