Receptive Field in CNNs
Understand receptive fields in CNNs: how convolutional layers expand their field of view and the gap between theoretical and effective receptive fields.
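The growth of the theoretical receptive field can be sketched with the standard recursion: each layer adds `(kernel - 1)` times the cumulative stride ("jump") of the layers before it. A minimal sketch (function name and layer format are illustrative, not from the original):

```python
def receptive_field(layers):
    """Theoretical receptive field of a stack of conv layers.

    layers: list of (kernel_size, stride) tuples, in network order.
    """
    rf, jump = 1, 1  # current receptive field and cumulative stride
    for k, s in layers:
        rf += (k - 1) * jump  # each layer widens the RF by (k-1) * jump
        jump *= s             # strides compound multiplicatively
    return rf

# Three stacked 3x3 stride-1 convs cover the same 7x7 window
# as a single 7x7 conv, but with fewer parameters.
print(receptive_field([(3, 1), (3, 1), (3, 1)]))  # → 7
```

Note that this is the *theoretical* receptive field; the *effective* receptive field, measured empirically, is typically much smaller and roughly Gaussian around the center.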