Illustrated WideResNet

Wide Residual Networks. To follow along, please refer to this Kaggle Kernel. Introduction: Prior to the introduction of Wide Residual Networks (WRNs) by Sergey Zagoruyko and Nikos Komodakis, deep residual networks were shown to yield only fractional gains in performance but at…

Named Entity Recognition using Reformers

In-depth end-to-end tutorial on implementing Named Entity Recognition on a Kaggle Dataset using the Trax Framework

Subset Selection

Introduction: The time and space complexity of any classifier or regressor depends on the number of inputs (d) and the size of the data sample (N). Subset Selection is a type of feature extraction method. The process of feature extraction basically involves…
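
As a concrete illustration of the idea, here is a minimal sketch of greedy forward subset selection, assuming scikit-learn-style estimators; the data, the linear model, and the feature budget are illustrative assumptions, not the post's code.

```python
# A minimal sketch of greedy forward subset selection: at each step, add the
# single feature that most improves cross-validated score. Illustrative only.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

def forward_selection(X, y, max_features):
    selected, remaining = [], list(range(X.shape[1]))
    while remaining and len(selected) < max_features:
        # Score each candidate feature when added to the current subset.
        scores = {
            j: cross_val_score(LinearRegression(), X[:, selected + [j]], y, cv=3).mean()
            for j in remaining
        }
        best = max(scores, key=scores.get)
        selected.append(best)
        remaining.remove(best)
    return selected

X, y = np.random.randn(100, 8), np.random.randn(100)  # hypothetical data
print(forward_selection(X, y, max_features=3))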

Clipping and Sampling for Recurrent Neural Networks

Introduction: In this blog post, I’ll walk you through the simple steps of clipping and sampling for recurrent neural networks. Gradient clipping ensures that your gradients won’t explode, so that training converges toward the optimum of the cost function more easily. We’ll…
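
For a concrete picture, here is a minimal numpy sketch of gradient clipping by global norm; the gradient dictionary and the threshold value are illustrative assumptions, not the post's code.

```python
# A minimal sketch of gradient clipping by global L2 norm, assuming gradients
# are stored in a dict of numpy arrays; names and threshold are illustrative.
import numpy as np

def clip_gradients(gradients, max_norm=5.0):
    """Scale all gradients down uniformly if their global norm exceeds max_norm."""
    global_norm = np.sqrt(sum(np.sum(g ** 2) for g in gradients.values()))
    if global_norm > max_norm:
        scale = max_norm / global_norm
        gradients = {name: g * scale for name, g in gradients.items()}
    return gradients

grads = {"W": np.random.randn(4, 4) * 100, "b": np.random.randn(4) * 100}
clipped = clip_gradients(grads)  # gradients now have norm at most 5.0
```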

CNN: Augmentation using Image Data Generator

An end-to-end tutorial on using TensorFlow for image classification, with image augmentation to enforce diversity in the training dataset
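
As a sketch of the idea, the snippet below uses Keras's ImageDataGenerator to produce randomly transformed copies of the training images on every epoch; the directory path and parameter values are illustrative, not the tutorial's settings.

```python
# A minimal sketch of augmentation with Keras's ImageDataGenerator; the
# directory layout and transform parameters here are illustrative assumptions.
from tensorflow.keras.preprocessing.image import ImageDataGenerator

datagen = ImageDataGenerator(
    rescale=1.0 / 255,       # normalise pixel values to [0, 1]
    rotation_range=20,       # random rotations up to 20 degrees
    width_shift_range=0.1,   # random horizontal shifts
    height_shift_range=0.1,  # random vertical shifts
    zoom_range=0.2,          # random zoom in/out
    horizontal_flip=True,    # random left-right flips
)

# Each epoch sees a differently transformed copy of every training image.
train_generator = datagen.flow_from_directory(
    "data/train",            # hypothetical directory of class subfolders
    target_size=(150, 150),
    batch_size=32,
    class_mode="categorical",
)
```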

Basic RNN

Introduction: In this blog, I’ll walk you through building the basic blocks of a Recurrent Neural Network: Importing Packages, Basic Functions, Basic RNN Cells, and Basic LSTM Cells.
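
For a flavour of the central building block, here is a minimal numpy sketch of one forward step of a basic RNN cell, assuming the standard tanh recurrence; the weight names and shapes are illustrative, not the post's code.

```python
# A minimal sketch of a single RNN cell forward step:
#   h_t = tanh(Wx @ x_t + Wh @ h_prev + b)
# Weight names and dimensions are illustrative assumptions.
import numpy as np

def rnn_cell_forward(x_t, h_prev, Wx, Wh, b):
    """Compute the new hidden state from the input and the previous state."""
    return np.tanh(Wx @ x_t + Wh @ h_prev + b)

n_x, n_h = 3, 5                    # input and hidden sizes (hypothetical)
x_t = np.random.randn(n_x)         # input at time step t
h_prev = np.zeros(n_h)             # previous hidden state
Wx = np.random.randn(n_h, n_x)     # input-to-hidden weights
Wh = np.random.randn(n_h, n_h)     # hidden-to-hidden weights
b = np.zeros(n_h)                  # bias
h_t = rnn_cell_forward(x_t, h_prev, Wx, Wh, b)
```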

Named Entity Recognition using Tensorflow

Introduction: Named Entity Recognition is a common task in Information Extraction that classifies the “named entities” in an unstructured text corpus. Most such systems have been built on unannotated corpora. The dataset used here is available at the…
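
For context, here is a minimal illustration of the token-and-tag format that NER training data typically uses (the common BIO scheme); the sentence and tag names below are assumptions, not rows from the actual dataset.

```python
# An illustrative example of BIO-tagged NER data: B- marks the beginning of an
# entity, I- its continuation, O a non-entity token. Not from the real dataset.
tokens = ["Barack", "Obama", "visited", "Paris", "in", "2009"]
tags   = ["B-per",  "I-per", "O",       "B-geo", "O",  "B-tim"]

# The model learns a mapping from each token to its entity tag.
for token, tag in zip(tokens, tags):
    print(f"{token:>8} -> {tag}")
```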

Continuous Bag of Words Model from Scratch

Introduction: In this post, I’ll walk through an implementation of the Continuous Bag of Words Model for generating word embedding vectors. Applications of word embeddings include semantic analogies, sentiment analysis, classification of customer feedback, machine translation, information extraction, and question answering.
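
As a taste of the preprocessing step, here is a minimal sketch of extracting (context, center) training pairs for CBOW with a context half-size C; the tokenisation and the example sentence are illustrative assumptions, not the post's code.

```python
# A minimal sketch of CBOW training-pair extraction: for each center word,
# collect the C words on either side as its context. Illustrative only.
def cbow_pairs(tokens, C=2):
    """Yield (context_words, center_word) for each valid position."""
    for i in range(C, len(tokens) - C):
        context = tokens[i - C:i] + tokens[i + 1:i + C + 1]
        yield context, tokens[i]

tokens = "i am happy because i am learning".split()  # hypothetical sentence
for context, center in cbow_pairs(tokens):
    print(context, "->", center)
```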

Making an Auto-Complete Program using N-Gram Models

Introduction: To build an auto-complete system, we first need a language model. A language model assigns a probability to a sequence of words, so the most linguistically plausible next word receives the highest probability. Major libraries: math…
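
To make the idea concrete, here is a minimal sketch of a bigram model suggesting the most probable next word; the toy corpus and the absence of smoothing are simplifying assumptions, not the post's implementation.

```python
# A minimal bigram language model: count word pairs, then suggest the word
# with the highest conditional probability given the previous word.
from collections import Counter, defaultdict

corpus = "i like tea . i like coffee . i am happy .".split()  # toy corpus

bigram_counts = defaultdict(Counter)
for w1, w2 in zip(corpus, corpus[1:]):
    bigram_counts[w1][w2] += 1

def suggest(prev_word):
    """Return the most probable next word after prev_word, or None."""
    counts = bigram_counts[prev_word]
    total = sum(counts.values())
    return max(counts, key=lambda w: counts[w] / total) if counts else None

print(suggest("i"))  # -> 'like' (2 of the 3 occurrences of "i" precede it)
```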

AI Poem Generator trained on Shakespeare’s work

Introduction: In this blog, I’ll show you how to build your own Natural Language Processing program that generates a poem from just its first line. As the program is…