📚 Deep Learning – Overview

Github: [Link]

lena-voita.github.io: [Link]

https://youtu.be/RqYzrGO_ZfI?si=0uERP4ml0gXtSYSp

Transformers Explained: The Discovery That Changed AI Forever

The Most Important Algorithm in Machine Learning

12 Types of Activation Functions in Neural Networks: A Comprehensive Guide

Why are Transformers replacing CNNs?

Is Signal Processing The CURE For AI's ADHD?

Modern Machine Learning Fundamentals: Cross-attention

The moment we stopped understanding AI [AlexNet]

I Visualised Attention in Transformers

A Dive Into Multihead Attention, Self-Attention and Cross-Attention

Modern Machine Learning Fundamentals: Transformers

Attention in Encoder-Decoder Models: LSTM Encoder-Decoder with Attention

How I Finally Understood LLM Attention

attention explained #machinelearning #explained

Visual Guide to Transformer Neural Networks - (Episode 2) Multi-Head & Self-Attention

Masking in Gen AI Training: The Hidden Genius in Transformers

How Cross Attention Powers Translation in Transformers | Encoder-Decoder Explained

Attention is all you need (Transformer) - Model explanation (including math), Inference and Training

🧠 Machine Learning Reviews
• Basic ML concepts
• Overfitting & regularization
• Supervised & unsupervised learning
• Evaluation metrics

🤖 Artificial Neural Networks with Keras
• Neural networks
• Activation functions
• Loss functions & optimizers
• Model training & evaluation

👁️ Deep Computer Vision with CNNs
• Convolutional layers
• Feature maps
• Pooling layers
• Object detection & segmentation
🧮 Loading & Preprocessing Data with TensorFlow
• tf.data API
• Keras preprocessing layers
• TensorFlow Datasets (TFDS)
• Performance optimization

🕹️ Einops & Einsum
• Einsum
• Free vs. summation indices
• Einops functions → rearrange, reduce, repeat
• Multidimensional tensors

🖊️ Processing Sequences with Recurrent and Convolutional Neural Networks
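The free vs. summation indices idea can be made concrete with `np.einsum` (einops' `rearrange`, `reduce`, and `repeat` build on the same index notation). A minimal sketch using NumPy, with toy matrices chosen only for illustration:

```python
import numpy as np

A = np.arange(6, dtype=float).reshape(2, 3)   # (2, 3)
B = np.arange(12, dtype=float).reshape(3, 4)  # (3, 4)

# In "ij,jk->ik", the index j appears in the inputs but not in the output:
# it is a summation index, so einsum sums over it (this is matrix multiply).
# i and k are free indices: they survive into the output shape.
C = np.einsum("ij,jk->ik", A, B)
assert np.allclose(C, A @ B)

# Trace of the identity: both indices are summed away, leaving a scalar.
t = np.einsum("ii->", np.eye(3))  # 3.0
```

The output spec after `->` is what decides which indices are free: drop an index there and einsum sums over it.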
💻 Natural Language Processing with RNNs & Attention

Transformer (decoder-only | GPT-style) exercise
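The core of a decoder-only block is masked (causal) self-attention. A minimal single-head sketch in NumPy — the weight matrices and sizes here are hypothetical, just to show the mechanics:

```python
import numpy as np

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head masked self-attention, as in a decoder-only (GPT-style) block.

    x: (seq_len, d_model); Wq/Wk/Wv: (d_model, d_head) projection matrices.
    """
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    d_head = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_head)              # (seq_len, seq_len)
    # Causal mask: position i may only attend to positions <= i.
    mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
    scores = np.where(mask, -np.inf, scores)
    # Row-wise softmax (masked positions get weight 0).
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                               # (seq_len, d_head)

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 4)) for _ in range(3))
out = causal_self_attention(x, Wq, Wk, Wv)
```

Because of the mask, changing a later token can never change the output at an earlier position — that is what makes next-token training possible.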

Machine learning recap:

| Set | Input (x) | Output (y) | y_true here | Goal |
| --- | --- | --- | --- | --- |
| Training | x_train | y_train | y_train | Train the model |
| Validation | x_val / x_valid | y_val / y_valid | y_val | Intermediate checks, catch overfitting |
| Test | x_test | y_test | y_test | Measure final performance |

y_true is not a separate dataset, but a general name for the "true labels" that you compare against the predictions (y_pred).
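The split and the role of y_true can be shown with a small sketch — the data and the "model" (a trivial threshold rule) are made up purely for illustration:

```python
import numpy as np

# Hypothetical toy data: 100 samples, 3 features, binary labels.
rng = np.random.default_rng(42)
X = rng.normal(size=(100, 3))
y = (X[:, 0] > 0).astype(int)

# 70/15/15 split: train to fit, validation to tune, test for the final score.
x_train, y_train = X[:70], y[:70]
x_val,   y_val   = X[70:85], y[70:85]
x_test,  y_test  = X[85:], y[85:]

# "y_true" is simply whichever label array matches the predictions:
y_pred = (x_test[:, 0] > 0).astype(int)  # a trivially perfect "model"
y_true = y_test
accuracy = (y_pred == y_true).mean()
```

On the validation set the same comparison (y_val as y_true) is done during training; the test set is only touched once, at the end.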