
A small experiment with convolutional neural networks in Keras.


Ankush7890/Bottle_Neck_Effect_using_MNIST


This Jupyter notebook explores convolutional neural networks, ranging from the simplest architecture with only a single dense layer to a network with two convolutional layers and max pooling. We observe how the various architectures affect train and test accuracy on the well-known MNIST dataset.
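
As a concrete reference point, a minimal Keras sketch of the two-convolutional-layer architecture might look like the following (the filter counts, kernel sizes, and training settings here are illustrative assumptions, not necessarily the exact values used in the notebook):

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Load MNIST and normalise pixel values to [0, 1]; add a channel axis.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None].astype("float32") / 255.0
x_test = x_test[..., None].astype("float32") / 255.0

# Two convolutional layers with max pooling, then a dense softmax classifier.
model = models.Sequential([
    layers.Conv2D(32, (3, 3), activation="relu", input_shape=(28, 28, 1)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))
```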

It also shows the so-called bottleneck effect, which occurs when the number of filters in the second convolutional layer is smaller than in the first. In this case backpropagation shuts off many filters in the first layer, because the second layer acts as a bottleneck. This is a major reason that commonly used architectures such as VGG and MobileNet increase the number of filters in the convolutional layers as one moves deeper into the network; this notebook tries to find a reason for that choice. In Simonyan et al. (2014), the authors comment that the reason for this is to prevent the loss of information.
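
To reproduce the bottleneck effect with the sketch above, one could invert the filter counts so that the second convolutional layer is narrower than the first (again, the specific counts are illustrative assumptions):

```python
from tensorflow.keras import layers, models

# Bottleneck variant: the second conv layer has far fewer filters than the
# first, so during backpropagation many first-layer filters receive little
# gradient and effectively shut off.
bottleneck = models.Sequential([
    layers.Conv2D(64, (3, 3), activation="relu", input_shape=(28, 28, 1)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(8, (3, 3), activation="relu"),  # 8 < 64: the bottleneck
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])
bottleneck.compile(optimizer="adam",
                   loss="sparse_categorical_crossentropy",
                   metrics=["accuracy"])
```

After training, comparing the first layer's weight magnitudes or activation maps between this model and the wider variant above is one way to see how many first-layer filters go quiet.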
