A detailed look at the miracles of quantization


Whenever I train a model in my deep learning projects and save it for production, the saved model takes up a huge amount of memory. That led me to research ways to shrink the saved model, and that’s where I came across the term “quantization”. In this article, I’d like to explain quantization in more detail, with some code samples and the theory behind it.


There are two forms of quantization:

  1. Post-training quantization
  2. Quantization-aware training

Start with post-training quantization, since it’s easier to use, though quantization-aware training often yields better model accuracy.
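To build intuition for what post-training quantization actually does, here is a minimal sketch in NumPy: it maps made-up float32 “weights” to int8 with a single scale factor (a symmetric scheme, chosen here for simplicity; real frameworks use more elaborate per-channel and affine schemes):

```python
import numpy as np

# Hypothetical float32 "weights" standing in for a trained layer.
weights = np.array([-1.8, -0.5, 0.0, 0.7, 2.3], dtype=np.float32)

def quantize_int8(w):
    """Symmetric quantization: map the largest magnitude to 127."""
    scale = np.abs(w).max() / 127.0
    q = np.round(w / scale).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float values from int8 codes."""
    return q.astype(np.float32) * scale

q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
print(q)          # int8 codes: 4x smaller than float32 in memory
print(restored)   # close to the original weights, up to rounding error
```

Each weight now costs 1 byte instead of 4, which is where the memory savings come from; the price is a small rounding error bounded by half the scale.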

The complete story behind my developer certificate.

I received my TensorFlow Developer Certificate!!

Initially, when I started working on machine learning projects, it felt great to learn those algorithms and apply them to related problems. But when I came to neural networks, they were hard to implement because the subject is so vast. So I started using TensorFlow, which took time to get to grips with. Somewhere I heard that if things look complicated, you should start simple. I did exactly that, beginning with the basic syntax and learning more about the API. …

Basic intuitions of Neural networks


When I started reading articles on neural networks, I struggled a lot to understand the basics of how they work. I kept reading more and more articles on the internet, grabbed the key points, and put them together in my private notes. Then I thought I’d publish them to help others understand better.

It would be fun to know the basics of any domain.



The perceptron is one of the simplest ANN architectures, invented in 1957 by Frank Rosenblatt. It is based on a slightly different artificial neuron called a TLU (Threshold Logic…
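As a rough sketch of how a threshold unit learns, here is a minimal perceptron trained on the AND gate with NumPy. The data, learning rate, and epoch count are made up for illustration, and this is a simplified version rather than Rosenblatt’s exact 1957 formulation:

```python
import numpy as np

# AND-gate training data: output is 1 only when both inputs are 1.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.1          # learning rate

for _ in range(20):                          # a few passes over the data
    for xi, target in zip(X, y):
        pred = 1 if xi @ w + b >= 0 else 0   # threshold (step) activation
        error = target - pred
        w += lr * error * xi                 # perceptron update rule
        b += lr * error

preds = [1 if xi @ w + b >= 0 else 0 for xi in X]
print(preds)  # matches the AND truth table once training converges
```

Because AND is linearly separable, the perceptron convergence theorem guarantees the updates settle on a separating line after finitely many mistakes.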

The toughest job in the field of Data Science is training DNNs


For the past two years, I have been working in the Data Science field, which constantly pushes me to learn and get better. When it comes to DNNs, I feel comfortable working with them and have gathered tricks to make neural networks run faster and get the best accuracy from my models. In this article, I share those tricks to turn the toughest job into a simple one.


Training DNNs is a tough job due to the computational time and effort involved. …

In detail with Bag of Words, TF-IDF, RNNs, GRUs & LSTMs.


Yes! Alexa is built on NLP modelling.

But how does NLP work, and how is it designed for modelling?

In this article, I will prepare data for NLP modelling. Moreover, I will explain everything in detail, both in theory and in code.

What is Natural Language Processing?

Natural Language Processing is about building models that let us interact with machines in human languages. It is a subfield of linguistics, Computer Science & AI.

NLP is nothing but designing systems that respond to human commands. Since commands arrive as text, the model needs to be trained on…

Trouble-Free Concept in 5 min.

What is Time Series Forecasting?

It is a model that predicts future values based on previously observed values. A time series has a data point at every time step.

Time Series Applications:

Time series is widely used with non-stationary data such as

stock prices, weather predictions, economics, retail sales, etc.

Aspects to consider when dealing with time series data:

  1. Stationarity
  2. Seasonality
  3. Auto Correlation

So, what do these aspects represent?

Stationarity:

Stationarity in time series is important: a time series is said to be stationary only if its statistical properties do not change over time; most importantly, it maintains a constant mean and variance.

Stationary: its behaviour doesn’t change over time.

Non-stationary: its behaviour changes over time.
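A quick sanity check for that constant-mean property is to compare the two halves of a series; as a sketch with synthetic NumPy data (the series below are simulated for illustration), a trend makes the halves disagree:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stationary series: white noise around a fixed mean.
stationary = rng.normal(loc=0.0, scale=1.0, size=500)

# Non-stationary series: the same noise plus an upward trend.
trend = np.linspace(0, 10, 500)
non_stationary = stationary + trend

def half_means(series):
    """Mean of the first half vs. mean of the second half."""
    mid = len(series) // 2
    return series[:mid].mean(), series[mid:].mean()

print(half_means(stationary))      # two similar means
print(half_means(non_stationary))  # clearly different means -> trend
```

In practice you would use a formal test such as the augmented Dickey-Fuller test, but this half-split check conveys the idea.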

Seasonality:


Learn in 5 min

What is Machine Learning?

Teaching or training machines to perform tasks and predict outcomes.

Types of machine learning

  • Supervised learning: consists of features and labels
  • Unsupervised learning: consists of only features

Supervised Learning

Takes features and labels in the training dataset, learns from them, and predicts outcomes.

  • Classification
  • Regression

Unsupervised Learning

Contains only features, and infers labels from patterns in those features.

  • Dimensionality reduction
  • Clustering

Machine Learning Algorithms

  1. Linear Regression
  2. Logistic Regression
  3. Naive Bayes
  4. Decision Tree
  5. Support vector machine
  6. K-Fold Cross-Validation
  7. Random Forest
  8. K-Means Clustering
  9. K-Nearest Neighbors
  10. Hierarchical Clustering
  11. Principal Component Analysis

Linear Regression

from sklearn.linear_model import LinearRegression
model = LinearRegression()
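To show the model doing something, here is a minimal fit/predict sketch on a tiny made-up dataset that follows y = 2x + 1 exactly:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Tiny synthetic dataset: y = 2x + 1 with no noise.
X = np.array([[0], [1], [2], [3]])
y = np.array([1, 3, 5, 7])

model = LinearRegression()
model.fit(X, y)

print(model.coef_, model.intercept_)  # recovers slope 2 and intercept 1
print(model.predict([[10]]))          # extrapolates to a new input
```

The same fit/predict pattern applies to every estimator in the list above, which is what makes scikit-learn’s API so convenient.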

Logistic Regression

from sklearn.linear_model import LogisticRegression
model = LogisticRegression()

Naive Bayes

from sklearn.naive_bayes import…

Akshith Kumar
