Showing posts with label GPT.

Saturday, November 5, 2022

Fundamentals of Controllable Text Generation

 


Author: Murat Karakaya
Date created: 21 April 2021
Last modified: 24 May 2021
Description: This is an introductory tutorial on Controllable Text Generation in Deep Learning, the second part of the “Controllable Text Generation with Transformers” series. The series focuses on developing a TensorFlow (TF) / Keras implementation of Controllable Text Generation from scratch. You can access all parts of the series on my blog at muratkarakaya.net.

Before getting started, I assume that you have already reviewed:

Please ensure that you have completed the above tutorial series so that you can easily follow the discussions below.
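
To give a flavor of the idea the series develops, here is a minimal sketch of control-token conditioned generation: a control token is prepended to the prompt so that the language model conditions its continuation on the desired attribute. The tiny untrained model, toy vocabulary size, and token ids below are illustrative assumptions, not the implementation built in the series.

```python
# Minimal, illustrative sketch of control-token conditioned generation.
# The tiny untrained model and the vocabulary/token ids are hypothetical;
# they only demonstrate the mechanics, not the series' actual implementation.
import tensorflow as tf

VOCAB_SIZE = 50          # toy vocabulary size (assumption)
CONTROL_POSITIVE = 1     # id of a hypothetical <positive> control token
CONTROL_NEGATIVE = 2     # id of a hypothetical <negative> control token

# A tiny autoregressive language model: Embedding -> GRU -> next-token logits.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, 32),
    tf.keras.layers.GRU(64, return_sequences=True),
    tf.keras.layers.Dense(VOCAB_SIZE),
])

def generate(prompt_ids, control_id, steps=5):
    """Greedy decoding conditioned on a control token prepended to the prompt."""
    ids = [control_id] + list(prompt_ids)
    for _ in range(steps):
        logits = model(tf.constant([ids]))        # (1, len(ids), VOCAB_SIZE)
        next_id = int(tf.argmax(logits[0, -1]))   # most likely next token
        ids.append(next_id)
    return ids

# With a trained model, changing the control token would steer the continuation.
print(generate([7, 8, 9], CONTROL_POSITIVE))
print(generate([7, 8, 9], CONTROL_NEGATIVE))
```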




Photo by Chris Leipelt on Unsplash

Friday, November 4, 2022

How to save and load a TensorFlow / Keras Model with Custom Objects?

 


Author: Murat Karakaya
Date created: 30 May 2021
Last modified: 30 July 2021
Description: This tutorial will design and train a Keras model (a miniature GPT3) with some custom objects (custom layers). We aim to learn how to save the trained model as a whole in the TensorFlow SavedModel format and how to load it back.
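
As a minimal sketch of the workflow, the snippet below defines a hypothetical custom layer (ScaleLayer), saves the whole model, and loads it back by passing custom_objects. The layer, the file path, and the TF 2.x SavedModel saving behavior are assumptions for illustration; the tutorial itself applies these steps to a miniature GPT3 model.

```python
# Sketch: saving and re-loading a Keras model that uses a custom layer.
# "ScaleLayer" and the path "my_model" are illustrative assumptions.
import tensorflow as tf

class ScaleLayer(tf.keras.layers.Layer):
    """A toy custom layer that multiplies its input by a fixed scale."""
    def __init__(self, scale=2.0, **kwargs):
        super().__init__(**kwargs)
        self.scale = scale

    def call(self, inputs):
        return inputs * self.scale

    def get_config(self):
        # Needed so that Keras can re-create the layer when loading the model.
        config = super().get_config()
        config.update({"scale": self.scale})
        return config

inputs = tf.keras.Input(shape=(4,))
outputs = tf.keras.layers.Dense(3)(ScaleLayer(scale=0.5)(inputs))
model = tf.keras.Model(inputs, outputs)

# Save the whole model as a TensorFlow SavedModel directory
# (TF 2.x / Keras 2 behavior, as in the tutorial's time frame).
model.save("my_model")

# Load it back; custom_objects tells Keras how to rebuild ScaleLayer.
restored = tf.keras.models.load_model(
    "my_model", custom_objects={"ScaleLayer": ScaleLayer}
)
print(restored(tf.ones((1, 4))))
```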



Photo by Markus Winkler on Unsplash

Multi-Class Text Classification with a GPT3 Transformer block: An End-to-End Example

 


Author: Murat Karakaya & Cansen Çağlayan
Date created: 05 Oct 2021
Last modified: 19 Oct 2021
Description: This tutorial has two parts, as explained below: Part A: Data Analysis & Text Preprocessing, and Part B: Text Classification.
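
As a rough sketch of what the two parts cover, the snippet below vectorizes a few toy texts (Part A) and trains a small multi-class classifier on them (Part B). The toy data, layer sizes, and the simple pooled-embedding classifier are illustrative assumptions; the tutorial's actual model is built around a GPT3 Transformer block.

```python
# Sketch of a multi-class text classification pipeline in Keras.
# The toy data and the pooled-embedding classifier are illustrative
# stand-ins for the GPT3 Transformer block used in the tutorial.
import tensorflow as tf

# Part A (sketch): text preprocessing with a TextVectorization layer.
texts = tf.constant(["great product", "terrible support",
                     "average experience", "great support"])
labels = tf.constant([0, 1, 2, 0])   # three hypothetical classes

vectorizer = tf.keras.layers.TextVectorization(max_tokens=1000,
                                               output_sequence_length=8)
vectorizer.adapt(texts)
token_ids = vectorizer(texts)        # shape (4, 8), integer token ids

# Part B (sketch): a small classifier ending in a softmax over the classes.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(1000, 16),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(3, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(token_ids, labels, epochs=2, verbose=0)

# Classify a new text by vectorizing it first.
print(model.predict(vectorizer(tf.constant(["great experience"]))))
```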



Photo by Håkon Grimstad on Unsplash