Foundations
1. Biological inspiration (neuron analogy)
2. Perceptron
3. Activation functions (sigmoid, tanh, ReLU, GELU, softmax)
4. Multi-layer perceptron (MLP)
5. Forward propagation
6. Loss functions (MSE, cross-entropy, NLL)
7. Backpropagation (chain rule in action)
8. Weight initialization
9. Batch normalization
10. Dropout regularization
11. Epochs, batches, iterations
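
To make the first few topics concrete, here is a minimal sketch of a single perceptron with a sigmoid activation; the function names (`sigmoid`, `perceptron`) and the example weights are illustrative choices, not part of the outline above.

```python
import math

def sigmoid(z):
    # squashes any real-valued input into the interval (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def perceptron(x, w, b):
    # weighted sum of inputs plus a bias term, passed through an activation
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return sigmoid(z)

# example: two inputs, hand-picked weights and bias
print(perceptron([1.0, 0.5], [0.4, -0.2], 0.1))  # ≈ 0.5987
```

Stacking layers of such units, each applying its activation to a weighted sum of the previous layer's outputs, yields the multi-layer perceptron covered in items 4 and 5.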