Neural Network Architectures

Deep neural networks and Deep Learning are powerful and popular algorithms, and much of their success lies in the careful design of the neural network architecture. There are different types of architectures, designed for specific needs and unique purposes. By studying them, we can get a general idea of how they work and why they were designed the way they are.

1. LeNet5

-The LeNet5 neural network architecture was created by Yann LeCun in 1994

-Tanh or sigmoid activation functions are used

-Spatial features are extracted using convolutions (a minimal code sketch follows this list)

-A sparse connection matrix between layers avoids a large computational cost
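
Below is a minimal LeNet5-style sketch in PyTorch. The layer sizes follow the commonly cited configuration, but this is only an illustration: it uses fully connected channels between layers rather than the original sparse connection matrix, so treat it as a sketch, not a faithful reimplementation.

```python
import torch
import torch.nn as nn

# LeNet5-style sketch: alternating convolution + tanh + pooling for spatial
# feature extraction, followed by fully connected layers for classification.
class LeNet5Sketch(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5),   # convolution extracts spatial features
            nn.Tanh(),
            nn.AvgPool2d(2),                  # subsampling
            nn.Conv2d(6, 16, kernel_size=5),
            nn.Tanh(),
            nn.AvgPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 5 * 5, 120),
            nn.Tanh(),
            nn.Linear(120, 84),
            nn.Tanh(),
            nn.Linear(84, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Usage: a 32x32 grayscale input, as in the original digit-recognition setting.
x = torch.randn(1, 1, 32, 32)
print(LeNet5Sketch()(x).shape)  # torch.Size([1, 10])
```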

2. AlexNet

-In 2012, Alex Krizhevsky released AlexNet, a deeper and much wider version of LeNet

-Rectified linear units (ReLU) are used as the activation function

-Overfitting is reduced by randomly ignoring individual neurons during training, using the technique of dropout (see the sketch after this list)

-Complex hierarchies of features and objects can be learned using this architecture.

-AlexNet scaled up the insights of LeNet into a much larger neural network
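
Here is a short sketch of how ReLU and dropout typically appear in an AlexNet-style classifier head (PyTorch assumed; the 4096-unit and 1000-class sizes match the usual description of AlexNet, but the block is illustrative rather than the full architecture):

```python
import torch
import torch.nn as nn

# AlexNet-style classifier head: ReLU activations plus dropout, which randomly
# zeroes individual neurons during training to reduce overfitting.
head = nn.Sequential(
    nn.Dropout(p=0.5),          # ignore ~50% of neurons on each training pass
    nn.Linear(4096, 4096),
    nn.ReLU(inplace=True),
    nn.Dropout(p=0.5),
    nn.Linear(4096, 1000),      # 1000 ImageNet classes
)

features = torch.randn(8, 4096)  # a hypothetical batch of feature vectors
head.train()                     # dropout is active in training mode
logits = head(features)
head.eval()                      # dropout is disabled at evaluation time
print(logits.shape)              # torch.Size([8, 1000])
```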

3. VGG

-The VGG networks from Oxford used smaller 3×3 filters in each convolutional layer

-In VGG, small filters were used even in the first layers of the network, something the LeNet architecture avoided; the large 9×9 or 11×11 filters of AlexNet were not used

-Recent network architectures such as ResNet and Inception reuse this idea of multiple 3×3 convolutions in series: two stacked 3×3 convolutions cover the same 5×5 receptive field as a single 5×5 convolution, but with fewer parameters and an extra non-linearity (see the comparison below)
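
The comparison below makes that trade-off concrete (PyTorch assumed; the channel count of 64 is a hypothetical choice):

```python
import torch.nn as nn

# Two 3x3 convolutions in series cover the same 5x5 receptive field as a single
# 5x5 convolution, but with fewer parameters and an extra non-linearity.
channels = 64  # hypothetical channel count

stacked_3x3 = nn.Sequential(
    nn.Conv2d(channels, channels, kernel_size=3, padding=1),
    nn.ReLU(inplace=True),
    nn.Conv2d(channels, channels, kernel_size=3, padding=1),
    nn.ReLU(inplace=True),
)
single_5x5 = nn.Conv2d(channels, channels, kernel_size=5, padding=2)

count = lambda m: sum(p.numel() for p in m.parameters())
print(count(stacked_3x3), count(single_5x5))  # prints 73856 102464
```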

4. ResNet

-ResNet has a simple idea: feed the output of two successive convolutional layers AND also bypass the input to the next layer! (a minimal residual block sketch follows this list)

-ResNet uses a fairly simple initial layer at the input (the stem): a 7×7 convolutional layer followed by a pool with stride 2. Contrast this with the more complex and less intuitive stems of Inception V3 and V4

-ResNet also uses a pooling layer plus softmax as the final classifier

-It has been found that ResNet usually operates on blocks of relatively low depth (~20–30 layers) which act in parallel, rather than as a serial flow through the entire length of the network
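
Here is a minimal residual block sketch in PyTorch: two 3×3 convolutions whose output is added to the bypassed input. Batch normalization and the channel count are illustrative choices, not details taken from this article:

```python
import torch
import torch.nn as nn

# Residual block sketch: two convolutional layers whose output is added to the
# (bypassed) input before the final activation.
class ResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)  # skip connection: bypass the input to the next layer

x = torch.randn(1, 64, 56, 56)
print(ResidualBlock(64)(x).shape)  # torch.Size([1, 64, 56, 56])
```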

Thanks!
