AI Digest

Latest Edition of the 541-Page New Book "Understanding Deep Learning"


  • By AiBard123
  • November 20, 2023 - 2 min read



Author: 人工智能技术与时代人物风云  Source: 人工智能技术与时代人物风云

"Understanding Deep Learning" [1] is a new book by Simon J.D. Prince, Professor of Computer Science at the University of Bath. It spans 20 chapters, moving from machine learning fundamentals through the main families of deep learning models, including recent architectures such as Transformers and graph neural networks. The coverage is systematic and comprehensive, and the book is well worth following. Its table of contents is as follows:

  • Chapter 1 - Introduction

  • Chapter 2 - Supervised learning

  • Chapter 3 - Shallow neural networks

  • Chapter 4 - Deep neural networks

  • Chapter 5 - Loss functions

  • Chapter 6 - Training models

  • Chapter 7 - Gradients and initialization

  • Chapter 8 - Measuring performance

  • Chapter 9 - Regularization

  • Chapter 10 - Convolutional nets

  • Chapter 11 - Residual networks and BatchNorm

  • Chapter 12 - Transformers

  • Chapter 13 - Graph neural networks

  • Chapter 14 - Unsupervised learning

  • Chapter 15 - Generative adversarial networks

  • Chapter 16 - Normalizing flows

  • Chapter 17 - Variational auto-encoders

  • Chapter 18 - Diffusion models

  • Chapter 19 - Deep reinforcement learning

  • Chapter 20 - Why does deep learning work?

  • Notebook 1.1 - Background mathematics: ipynb/colab

  • Notebook 2.1 - Supervised learning: ipynb/colab

  • Notebook 3.1 - Shallow networks I: ipynb/colab

  • Notebook 3.2 - Shallow networks II: ipynb/colab

  • Notebook 3.3 - Shallow network regions: ipynb/colab

  • Notebook 3.4 - Activation functions: ipynb/colab

  • Notebook 4.1 - Composing networks: ipynb/colab

  • Notebook 4.2 - Clipping functions: ipynb/colab

  • Notebook 4.3 - Deep networks: ipynb/colab

  • Notebook 5.1 - Least squares loss: ipynb/colab

  • Notebook 5.2 - Binary cross-entropy loss: ipynb/colab

  • Notebook 5.3 - Multiclass cross-entropy loss: ipynb/colab

  • Notebook 6.1 - Line search: ipynb/colab

  • Notebook 6.2 - Gradient descent: ipynb/colab

  • Notebook 6.3 - Stochastic gradient descent: ipynb/colab

  • Notebook 6.4 - Momentum: ipynb/colab

  • Notebook 6.5 - Adam: ipynb/colab

  • Notebook 7.1 - Backpropagation in toy model: ipynb/colab

  • Notebook 7.2 - Backpropagation: ipynb/colab

  • Notebook 7.3 - Initialization: ipynb/colab

  • Notebook 8.1 - MNIST-1D performance: ipynb/colab

  • Notebook 8.2 - Bias-variance trade-off: ipynb/colab

  • Notebook 8.3 - Double descent: ipynb/colab

  • Notebook 8.4 - High-dimensional spaces: ipynb/colab

  • Notebook 9.1 - L2 regularization: ipynb/colab

  • Notebook 9.2 - Implicit regularization: ipynb/colab

  • Notebook 9.3 - Ensembling: ipynb/colab

  • Notebook 9.4 - Bayesian approach: ipynb/colab

  • Notebook 9.5 - Augmentation: ipynb/colab

  • Notebook 10.1 - 1D convolution: ipynb/colab

  • Notebook 10.2 - Convolution for MNIST-1D: ipynb/colab

  • Notebook 10.3 - 2D convolution: ipynb/colab

  • Notebook 10.4 - Downsampling & upsampling: ipynb/colab

  • Notebook 10.5 - Convolution for MNIST: ipynb/colab

  • Notebook 11.1 - Shattered gradients: ipynb/colab

  • Notebook 11.2 - Residual networks: ipynb/colab

  • Notebook 11.3 - Batch normalization: ipynb/colab

  • Notebook 12.1 - Self-attention: ipynb/colab

  • Notebook 12.2 - Multi-head self-attention: ipynb/colab

  • Notebook 12.3 - Tokenization: ipynb/colab

  • Notebook 12.4 - Decoding strategies: ipynb/colab

  • Notebook 13.1 - Encoding graphs: ipynb/colab

  • Notebook 13.2 - Graph classification: ipynb/colab

  • Notebook 13.3 - Neighborhood sampling: ipynb/colab

  • Notebook 13.4 - Graph attention: ipynb/colab

  • Notebook 15.1 - GAN toy example: ipynb/colab

  • Notebook 15.2 - Wasserstein distance: ipynb/colab

  • Notebook 16.1 - 1D normalizing flows: ipynb/colab

  • Notebook 16.2 - Autoregressive flows: ipynb/colab

  • Notebook 16.3 - Contraction mappings: ipynb/colab

  • Notebook 17.1 - Latent variable models: ipynb/colab

  • Notebook 17.2 - Reparameterization trick: ipynb/colab

  • Notebook 17.3 - Importance sampling: ipynb/colab

  • Notebook 18.1 - Diffusion encoder: ipynb/colab

  • Notebook 18.2 - 1D diffusion model: ipynb/colab

  • Notebook 18.3 - Reparameterized model: ipynb/colab

  • Notebook 18.4 - Families of diffusion models: ipynb/colab

  • Notebook 19.1 - Markov decision processes: ipynb/colab

  • Notebook 19.2 - Dynamic programming: ipynb/colab

  • Notebook 19.3 - Monte-Carlo methods: ipynb/colab

  • Notebook 19.4 - Temporal difference methods: ipynb/colab

  • Notebook 19.5 - Control variates: ipynb/colab

  • Notebook 20.1 - Random data: ipynb/colab

  • Notebook 20.2 - Full-batch gradient descent: ipynb/colab

  • Notebook 20.3 - Lottery tickets: ipynb/colab

  • Notebook 20.4 - Adversarial attacks: ipynb/colab

  • Notebook 21.1 - Bias mitigation: ipynb/colab

  • Notebook 21.2 - Explainability: ipynb/colab
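
The entries above link to hands-on Jupyter/Colab notebooks. As a flavour of the kind of material they cover (for example, Notebook 12.1 on self-attention), below is a minimal NumPy sketch of scaled dot-product self-attention. It is an illustrative example written for this digest, not code from the book's notebooks, and the toy dimensions and random weight matrices are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # X: (n_tokens, d_model); Wq, Wk, Wv: (d_model, d_head).
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # similarity of each query to each key
    weights = softmax(scores, axis=-1)        # each row sums to 1: attention over tokens
    return weights @ V                        # weighted mixture of value vectors

# Toy usage (hypothetical sizes, not taken from the book).
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))                   # 5 tokens, model dimension 8
Wq, Wk, Wv = (rng.normal(size=(8, 4)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)    # -> (5, 4)
```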

References:
[1] https://udlbook.github.io/udlbook/

