Practical Deep Learning: A Python-Based Introduction, 2nd Edition

Author: Ronald T. Kneusel
Publication date: 2025
Publisher: No Starch Press, Inc.
Page count: 759
File size: 6.0 MB
File type: PDF

Cover Page....2

Title Page....3

Copyright Page....4

Dedication Page....6

About the Author....7

About the Technical Reviewer....8

BRIEF CONTENTS....9

CONTENTS IN DETAIL....11

FOREWORD TO THE FIRST EDITION....24

ACKNOWLEDGMENTS....27

INTRODUCTION....28

Who This Book Is For....30

What You Can Expect to Learn....30

About This Book....31

Terminology....32

What’s New in the Second Edition....34

Synopsis....35

0 ENVIRONMENT AND MATHEMATICAL PRELIMINARIES....39

The Operating Environment....39

NumPy....40

scikit-learn....40

TensorFlow with Keras....40

Installing the Toolkits....41

Basic Linear Algebra....43

Vectors....43

Matrices....44

Vector and Matrix Multiplication....44

Statistics and Probability....46

Descriptive Statistics....46

Probability Distributions....47

Statistical Tests....48

Graphics Processing Units....49

Summary....50

PART I DATA IS EVERYTHING....51

1 IT’S ALL ABOUT THE DATA....52

Classes and Labels....52

Features and Feature Vectors....53

Types of Features....54

Feature Selection and the Curse of Dimensionality....57

Qualities of a Good Dataset....60

Interpolation and Extrapolation....61

The Parent Distribution....63

Prior Class Probabilities....64

Confusers....66

Dataset Size....66

Data Preparation....67

Scaling Features....68

Dealing with Missing Features....74

Training, Validation, and Test Data....75

The Three Subsets....76

Dataset Partitioning....77

k-Fold Cross-Validation....84

Data Analysis....86

How to Find Problems in the Data....87

Cautionary Tales....92

Summary....93

2 BUILDING THE DATASETS....94

Irises....94

Breast Cancer....97

MNIST Digits....100

CIFAR-10....104

Data Augmentation....107

Reasoning....108

Methods....110

The Iris Dataset....111

The CIFAR-10 Dataset....118

Summary....123

PART II CLASSICAL MACHINE LEARNING....125

3 INTRODUCTION TO MACHINE LEARNING....126

Nearest Centroid....127

k-Nearest Neighbors....132

Naive Bayes....134

Tree Classifiers....139

Recursion Primer....143

Decision Trees....144

Random Forests....146

Support Vector Machines....148

Margins....148

Support Vectors....151

Optimization....152

Kernels....152

Summary....154

4 EXPERIMENTS WITH CLASSICAL MODELS....155

Experiments with the Iris Dataset....155

Testing the Classical Models....156

Implementing a Nearest-Centroid Classifier....160

Experiments with the Breast Cancer Dataset....162

Running Two Initial Tests....163

Testing the Effect of Random Splits....166

Adding k-Fold Validation....168

Searching for Hyperparameters....175

Experiments with the MNIST Dataset....182

Testing the Classical Models....182

Analyzing Runtimes....190

Experimenting with PCA Components....193

Scrambling the Dataset....196

Classical Model Summary....198

Nearest Centroid....198

k-Nearest Neighbors....199

Naive Bayes....199

Decision Trees....200

Random Forests....200

Support Vector Machines....201

When to Use Classical Models....202

Handling Small Datasets....202

Dealing with Reduced Computational Requirements....202

Having Explainable Models....203

Working with Vector Inputs....203

Summary....204

PART III NEURAL NETWORKS....205

5 INTRODUCTION TO NEURAL NETWORKS....206

Anatomy of a Neural Network....207

The Neuron....208

Activation Functions....210

The Architecture of a Network....215

Output Layers....217

Weight and Bias Representation....220

A Simple Neural Network Implementation....221

Building the Dataset....222

Implementing the Neural Network....224

Training and Testing the Neural Network....227

Summary....230

6 TRAINING A NEURAL NETWORK....231

A High-Level Overview....231

Gradient Descent....233

Finding Minimums....235

Updating the Weights....237

Stochastic Gradient Descent....238

Batches and Minibatches....238

Convex vs. Nonconvex Functions....240

When to Stop Training....242

The Learning Rate....244

Momentum....245

Backpropagation....245

A Simple Example....247

An Abstract Example....251

Loss Functions....256

Absolute and Mean Squared Error Loss....256

Cross-Entropy Loss....257

Weight Initialization....259

Managing Model Complexity and Generalization....261

Overfitting....262

Regularization....265

L2 Regularization....266

Dropout....268

Summary....270

7 EXPERIMENTS WITH NEURAL NETWORKS....273

The Dataset....273

The MLPClassifier Class....274

Architecture and Activation Functions....275

The Code....275

The Results....280

Batch Size....284

Base Learning Rate....289

Training-Set Size....292

L2 Regularization....293

Momentum....296

Weight Initialization....298

Feature Ordering....303

Summary....305

8 EVALUATING MODELS....306

Definitions and Assumptions....306

Why Accuracy Is Not Enough....307

The 2×2 Confusion Matrix....310

Metrics Derived from the 2×2 Confusion Matrix....314

Deriving Metrics from the 2×2 Table....314

Using Metrics to Interpret Models....318

More-Advanced Metrics....320

Informedness and Markedness....321

F1 Score....322

Cohen’s Kappa....322

Matthews Correlation Coefficient....323

Metric Implementation....324

The Receiver Operating Characteristics Curve....326

Gathering the Models....326

Plotting the Metrics....328

Exploring the ROC Curve....330

Comparing Models with ROC Analysis....333

Generating an ROC Curve....336

Handling Multiple Classes....339

Extending the Confusion Matrix....340

Calculating Weighted Accuracy....344

Considering the Multiclass Matthews Correlation Coefficient....347

Summary....348

PART IV CONVOLUTIONAL NEURAL NETWORKS....350

9 INTRODUCTION TO CONVOLUTIONAL NEURAL NETWORKS....351

Why Convolutional Neural Networks?....352

Convolution....353

Scanning with the Kernel....353

Using Convolution for Image Processing....357

Anatomy of a Convolutional Neural Network....359

Exploring the Types of Layers....360

Passing Data Through the CNN....362

Convolutional Layers....363

How They Work....364

In Action....367

Multiple Layers....370

Initialization....372

Pooling Layers....372

Fully Connected Layers....374

Fully Convolutional Layers....376

How the CNN Operates....378

Summary....384

10 EXPERIMENTS WITH KERAS AND MNIST....386

Building CNNs in Keras....386

Loading the MNIST Data....387

Building the Model....389

Training and Evaluating the Model....392

Plotting the Error....395

Basic Experiments....398

Architecture Experiments....399

Training-Set Size, Minibatches, and Epochs....403

Optimizers....408

Fully Convolutional Networks....410

Building and Training the Model....411

Making the Test Images....414

Testing the Model....416

Scrambled MNIST Digits....427

Summary....428

11 EXPERIMENTS WITH CIFAR-10....430

A CIFAR-10 Refresher....430

The Full CIFAR-10 Dataset....432

Building the Models....432

Analyzing the Models....437

Animal or Vehicle?....439

Binary or Multiclass?....446

Using the Augmented CIFAR-10 Dataset....451

Summary....457

12 A CASE STUDY: CLASSIFYING AUDIO SAMPLES....459

Building the Dataset....459

Augmenting the Dataset....461

Preprocessing the Data....466

Classifying the Audio Features....468

With Classical Models....468

With a Traditional Neural Network....471

With a Convolutional Neural Network....472

Spectrograms....479

Classifying Spectrograms....484

Ensembles....489

Summary....495

PART V ADVANCED NETWORKS AND GENERATIVE AI....497

13 ADVANCED CNN ARCHITECTURES....498

The Keras Functional API....499

VGG....504

Standardizing with Batch Normalization....505

Applying Dropout After Convolutional Layers....506

Building the VGG8 Model....507

Testing the VGG8 Model....512

ResNet....517

Mitigating the Vanishing Gradient Problem....519

Exploring ResNet Configurations....521

Building the ResNet-18 Model....523

Testing the ResNet-18 Model....526

MobileNet....528

Implementing Depthwise Convolutions....528

Building the MobileNet Model....532

Testing the MobileNet Model....533

Building an Ensemble....534

Summary....537

14 FINE-TUNING AND TRANSFER LEARNING....539

Fine-Tuning a Pretrained Model....540

VGG16 and MobileNet with CIFAR-10....541

Experiments....546

A Study in Fine-Tuning....550

Creating New Features with Transfer Learning....555

Extracting CIFAR-10 Features with VGG16....555

Training Classical Models with CIFAR-10 Features....560

Detecting Anomalies with CIFAR-10 Features....562

Retrieving Images with CIFAR-10 Features....565

Summary....572

15 FROM CLASSIFICATION TO LOCALIZATION....574

Detection Experiments with MNIST....575

Building the Dataset....576

Building the Model....578

Running the Model....581

Running an Advanced Detection Model....590

Semantic Segmentation with U-Net....593

Implementing the U-Net Model....595

Running the Model and Interpreting Its Output....599

Multilabel Classification....605

Dataset....606

Performance....608

Summary....611

16 SELF-SUPERVISED LEARNING....613

Building the Unlabeled Dataset....614

Rotation Prediction....615

Building and Testing RotNet....616

Exploring Variations on the RotNet Theme....622

Examining RotNet Features....626

Fine-Tuning Experiments....630

Siamese Networks....633

Building and Testing the Siamese Networks....634

Examining Siamese Network Features....640

Fine-Tuning Experiments....642

For Further Exploration....644

Summary....644

17 GENERATIVE ADVERSARIAL NETWORKS....646

How GANs Work....646

Unconditional GANs....647

Building a GAN with Multilayer Perceptrons....647

Building a GAN with Convolutional Layers....650

Experimenting with Unconditional GANs....653

Conditional GANs....659

Implementing a Conditional GAN....659

Experimenting with Conditional GANs....662

Exploring the Latent Space....663

Intrinsic Dimensionality....663

Latent Space Interpolation....665

Summary....667

18 LARGE LANGUAGE MODELS....668

Understanding Large Language Models....670

Evaluating the LLM Block Diagram....671

Tokenizing and Embedding....672

Exploring the Transformer Layer....675

Predicting the Next Token....680

In-Context Learning....683

Configuring the Experiment....683

Running the Experiment....685

Testing Another LLM....689

Running LLMs Locally....690

Representing Parameters with Quantization....691

Installing Ollama....692

Installing Open Source LLMs....693

Chatting with a Simple Chatbot....694

LLM Embeddings....697

Embeddings Encode Meaning....698

Semantic Search and Retrieval-Augmented Generation....702

Sentiment Analysis....708

LLMs and Images....713

Describing Images....713

Classifying Images with an LLM?....717

Are LLMs Creative?....719

The Divergent Association Task....719

DAT Scores as a Function of Temperature....722

Creative Writing as a Function of Temperature....725

Summary....728

AFTERWORD....730

INDEX....733

Deep learning made simple.

Dip into deep learning without drowning in theory with this fully updated edition of Practical Deep Learning from experienced author and AI expert Ronald T. Kneusel.

After a brief review of basic math and coding principles, you’ll dive into hands-on experiments and learn to build working models for everything from image analysis to creative writing, and gain a thorough understanding of how each technique works under the hood. Whether you’re a developer looking to add AI to your toolkit or a student seeking practical machine learning skills, this book will teach you:

  • How neural networks work and how they’re trained
  • How to use classical machine learning models
  • How to develop a deep learning model from scratch
  • How to evaluate models with industry-standard metrics
  • How to create your own generative AI models

Each chapter emphasizes practical skill development and experimentation, building to a case study that incorporates everything you’ve learned to classify audio recordings. Working code examples that you can easily run and modify are provided throughout, and all code is freely available on GitHub. With Practical Deep Learning, 2nd Edition, you’ll gain the skills and confidence you need to build real AI systems that solve real problems.

New to this edition:

New material covers computer vision, fine-tuning and transfer learning, localization, self-supervised learning, generative AI for novel image creation, and large language models for in-context learning, semantic search, and retrieval-augmented generation (RAG).

