Cover....1
Copyright....3
Packt Page....4
Contributors....5
Table of Contents....8
Preface....20
Chapter 1: Getting Started with Next-Generation Artificial Intelligence through Reinforcement Learning....30
Reinforcement learning concepts....31
How to adapt to machine thinking and become an adaptive thinker....33
Overcoming real-life issues using the three-step approach....34
Step 1 – describing a problem to solve: MDP in natural language....36
Watching the MDP agent at work....37
Step 2 – building a mathematical model: the mathematical representation of the Bellman equation and MDP....39
From MDP to the Bellman equation....39
Step 3 – writing source code: implementing the solution in Python....43
The lessons of reinforcement learning....45
How to use the outputs....47
Possible use cases....49
Machine learning versus traditional applications....52
Summary....53
Questions....53
Further reading....54
Chapter 2: Building a Reward Matrix – Designing Your Datasets....56
Designing datasets – where the dream stops and the hard work begins....57
Designing datasets....58
Using the McCulloch-Pitts neuron....58
The McCulloch-Pitts neuron....60
The Python-TensorFlow architecture....64
Logistic activation functions and classifiers....64
Overall architecture....64
Logistic classifier....65
Logistic function....66
Softmax....67
Summary....71
Questions....72
Further reading....72
Chapter 3: Machine Intelligence – Evaluation Functions and Numerical Convergence....74
Tracking down what to measure and deciding how to measure it....75
Convergence....77
Implicit convergence....78
Numerically controlled gradient descent convergence....78
Evaluating beyond human analytic capacity....85
Using supervised learning to evaluate a result that surpasses human analytic capacity....89
Summary....93
Questions....94
Further reading....94
Chapter 4: Optimizing Your Solutions with K-Means Clustering....96
Dataset optimization and control....97
Designing a dataset and choosing an ML/DL model....98
Approval of the design matrix....99
Implementing a k-means clustering solution....103
The vision....103
The data....104
The strategy....105
The k-means clustering program....106
The mathematical definition of k-means clustering....107
The Python program....109
Saving and loading the model....113
Analyzing the results....114
Bot virtual clusters as a solution....115
The limits of the implementation of the k-means clustering algorithm....116
Summary....117
Questions....117
Further reading....118
Chapter 5: How to Use Decision Trees to Enhance K-Means Clustering....120
Unsupervised learning with KMC with large datasets....121
Identifying the difficulty of the problem....123
NP-hard – the meaning of P....123
NP-hard – the meaning of non-deterministic....124
Implementing random sampling with mini-batches....124
Using the LLN....125
The CLT....125
Using a Monte Carlo estimator....126
Trying to train the full training dataset....127
Training a random sample of the training dataset....127
Shuffling as another way to perform random sampling....129
Chaining supervised learning to verify unsupervised learning....131
Preprocessing raw data....132
A pipeline of scripts and ML algorithms....132
Step 1 – training and exporting data from an unsupervised ML algorithm....134
Step 2 – training a decision tree....135
Step 3 – a continuous cycle of KMC chained to a decision tree....139
Random forests as an alternative to decision trees....143
Summary....147
Questions....147
Further reading....148
Chapter 6: Innovating AI with Google Translate....150
Understanding innovation and disruption in AI....152
Is AI disruptive?....152
AI is based on mathematical theories that are not new....153
Neural networks are not new....153
Looking at disruption – the factors that are making AI disruptive....154
Cloud server power, data volumes, and web sharing of the early 21st century....154
Public awareness....155
Inventions versus innovations....155
Revolutionary versus disruptive solutions....156
Where to start?....156
Discover a world of opportunities with Google Translate....157
Getting started....157
The program....157
The header....157
Implementing Google's translation service....158
Google Translate from a linguist's perspective....159
Playing with the tool....160
Linguistic assessment of Google Translate....160
AI as a new frontier....164
Lexical field and polysemy....164
Exploring the frontier – customizing Google Translate with a Python program....166
k-nearest neighbor algorithm....167
Implementing the KNN algorithm....168
The knn_polysemy.py program....171
Implementing the KNN function in Google_Translate_Customized.py....173
Conclusions on the Google Translate customized experiment....181
The disruptive revolutionary loop....182
Summary....182
Questions....183
Further reading....183
Chapter 7: Optimizing Blockchains with Naive Bayes....186
Part I – the background to blockchain technology....187
Mining bitcoins....188
Using cryptocurrency....189
Part II – using blockchains to share information in a supply chain....190
Using blockchains in the supply chain network....193
Creating a block....194
Exploring the blocks....195
Part III – optimizing a supply chain with naive Bayes in a blockchain process....196
A naive Bayes example....196
The blockchain anticipation novelty....198
The goal – optimizing storage levels using blockchain data....199
Implementation of naive Bayes in Python....202
Gaussian naive Bayes....202
Summary....206
Questions....206
Further reading....207
Chapter 8: Solving the XOR Problem with a Feedforward Neural Network....208
The original perceptron could not solve the XOR function....209
XOR and linearly separable models....210
Linearly separable models....210
The XOR limit of a linear model, such as the original perceptron....211
Building an FNN from scratch....213
Step 1 – defining an FNN....213
Step 2 – an example of how two children can solve the XOR problem every day....214
Implementing a vintage XOR solution in Python with an FNN and backpropagation....218
A simplified version of a cost function and gradient descent....220
Linear separability was achieved....223
Applying the FNN XOR function to optimizing subsets of data....225
Summary....231
Questions....232
Further reading....232
Chapter 9: Abstract Image Classification with Convolutional Neural Networks (CNNs)....234
Introducing CNNs....235
Defining a CNN....236
Initializing the CNN....238
Adding a 2D convolution layer....239
Kernel....239
Shape....244
ReLU....244
Pooling....247
Next convolution and pooling layer....248
Flattening....249
Dense layers....249
Dense activation functions....250
Training a CNN model....250
The goal....251
Compiling the model....252
The loss function....252
The Adam optimizer....254
Metrics....255
The training dataset....255
Data augmentation....256
Loading the data....256
The testing dataset....257
Data augmentation on the testing dataset....257
Loading the data....257
Training with the classifier....258
Saving the model....259
Next steps....259
Summary....260
Questions....260
Further reading and references....260
Chapter 10: Conceptual Representation Learning....262
Generating profit with transfer learning....263
The motivation behind transfer learning....264
Inductive thinking....264
Inductive abstraction....264
The problem AI needs to solve....265
The gap concept....266
Loading the trained TensorFlow 2.x model....267
Loading and displaying the model....267
Loading the model to use it....271
Defining a strategy....274
Making the model profitable by using it for another problem....275
Domain learning....276
How to use the programs....276
The trained models used in this section....277
The trained model program....277
Gap – loaded or underloaded....278
Gap – jammed or open lanes....280
Gap datasets and subsets....282
Generalizing the gap concept (the gap conceptual dataset)....282
The motivation of conceptual representation learning metamodels applied to dimensionality....283
The curse of dimensionality....283
The blessing of dimensionality....284
Summary....285
Questions....286
Further reading....286
Chapter 11: Combining Reinforcement Learning and Deep Learning....288
Planning and scheduling today and tomorrow....289
A real-time manufacturing process....291
Amazon must expand its services to face competition....291
A real-time manufacturing revolution....292
CRLMM applied to an automated apparel manufacturing process....295
An apparel manufacturing process....296
Training the CRLMM....298
Generalizing the unit training dataset....298
Food conveyor belt processing – positive p and negative n gaps....299
Running a prediction program....303
Building the RL-DL-CRLMM....303
A circular process....304
Implementing a CNN-CRLMM to detect gaps and optimize....305
Q-learning – MDP....306
MDP inputs and outputs....307
The optimizer....310
The optimizer as a regulator....310
Finding the main target for the MDP function....313
A circular model – a stream-like system that never starts nor ends....315
Summary....320
Questions....320
Further reading....321
Chapter 12: AI and the Internet of Things (IoT)....322
The public service project....323
Setting up the RL-DL-CRLMM model....324
Applying the model of the CRLMM....326
The dataset....327
Using the trained model....329
Adding an SVM function....330
Motivation – using an SVM to increase safety levels....331
Definition of a support vector machine....332
Python function....334
Running the CRLMM....336
Finding a parking space....336
Deciding how to get to the parking lot....339
Support vector machine....340
The itinerary graph....342
The weight vector....343
Summary....344
Questions....345
Further reading....345
Chapter 13: Visualizing Networks with TensorFlow 2.x and TensorBoard....346
Exploring the output of the layers of a CNN in two steps with TensorFlow....347
Building the layers of a CNN....348
Processing the visual output of the layers of a CNN....352
Analyzing the visual output of the layers of a CNN....356
Analyzing the accuracy of a CNN using TensorBoard....363
Getting started with Google Colaboratory....363
Defining and training the model....365
Introducing some of the measurements....368
Summary....370
Questions....371
Further reading....371
Chapter 14: Preparing the Input of Chatbots with Restricted Boltzmann Machines (RBMs) and Principal Component Analysis (PCA)....372
Defining basic terms and goals....373
Introducing and building an RBM....374
The architecture of an RBM....375
An energy-based model....376
Building the RBM in Python....379
Creating a class and the structure of the RBM....379
Creating a training function in the RBM class....379
Computing the hidden units in the training function....380
Random sampling of the hidden units for the reconstruction and contrastive divergence....381
Reconstruction....382
Contrastive divergence....383
Error and energy function....383
Running the epochs and analyzing the results....384
Using the weights of an RBM as feature vectors for PCA....386
Understanding PCA....391
Mathematical explanation....392
Using TensorFlow's Embedding Projector to represent PCA....396
Analyzing the PCA to obtain input entry points for a chatbot....399
Summary....401
Questions....402
Further reading....402
Chapter 15: Setting Up a Cognitive NLP UI/CUI Chatbot....404
Basic concepts....405
Defining NLU....405
Why do we call chatbots "agents"?....405
Creating an agent to understand Dialogflow....406
Entities....407
Intents....411
Context....416
Adding fulfillment functionality to an agent....421
Defining fulfillment....422
Enhancing the cogfilmdr agent with a fulfillment webhook....423
Getting the bot to work on your website....426
Machine learning agents....427
Using machine learning in a chatbot....427
Speech-to-text....427
Text-to-speech....428
Spelling....430
Why are these machine learning algorithms important?....432
Summary....433
Questions....434
Further reading....434
Chapter 16: Improving the Emotional Intelligence Deficiencies of Chatbots....436
From reacting to emotions, to creating emotions....437
Solving the problems of emotional polysemy....437
The greetings problem example....438
The affirmation example....439
The speech recognition fallacy....439
The facial analysis fallacy....440
Small talk....441
Courtesy....441
Emotions....444
Data logging....444
Creating emotions....447
RNN research for future automatic dialog generation....452
RNNs at work....453
RNN, LSTM, and vanishing gradients....454
Text generation with an RNN....455
Vectorizing the text....455
Building the model....456
Generating text....458
Summary....460
Questions....461
Further reading....461
Chapter 17: Genetic Algorithms in Hybrid Neural Networks....462
Understanding evolutionary algorithms....463
Heredity in humans....463
Our cells....464
How heredity works....464
Evolutionary algorithms....465
Going from a biological model to an algorithm....466
Basic concepts....466
Building a genetic algorithm in Python....469
Importing the libraries....469
Calling the algorithm....470
The main function....470
The parent generation process....471
Generating a parent....471
Fitness....472
Display parent....473
Crossover and mutation....474
Producing generations of children....476
Summary code....479
Unspecified target to optimize the architecture of a neural network with a genetic algorithm....480
A physical neural network....480
What is the nature of this mysterious S-FNN?....481
Calling the algorithm cell....482
Fitness cell....483
ga_main() cell....484
Artificial hybrid neural networks....485
Building the LSTM....486
The goal of the model....487
Summary....488
Questions....489
Further reading....489
Chapter 18: Neuromorphic Computing....490
Neuromorphic computing....491
Getting started with Nengo....492
Installing Nengo and Nengo GUI....493
Creating a Python program....495
A Nengo ensemble....495
Nengo neuron types....496
Nengo neuron dimensions....497
A Nengo node....497
Connecting Nengo objects....499
Visualizing data....499
Probes....504
Applying Nengo's unique approach to critical AI research areas....508
Summary....511
Questions....512
References....512
Further reading....512
Chapter 19: Quantum Computing....514
The rising power of quantum computers....515
Quantum computer speed....516
Defining a qubit....519
Representing a qubit....519
The position of a qubit....520
Radians, degrees, and rotations....521
The Bloch sphere....522
Composing a quantum score....523
Quantum gates with Quirk....523
A quantum computer score with Quirk....525
A quantum computer score with IBM Q....526
A thinking quantum computer....529
Representing our mind's concepts....529
Expanding MindX's conceptual representations....529
The MindX experiment....530
Preparing the data....530
Transformation functions – the situation function....530
Transformation functions – the quantum function....533
Creating and running the score....533
Using the output....535
Summary....536
Questions....536
Further reading....537
Appendix: Answers to the Questions....538
Chapter 1 – Getting Started with Next-Generation Artificial Intelligence through Reinforcement Learning....538
Chapter 2 – Building a Reward Matrix – Designing Your Datasets....540
Chapter 3 – Machine Intelligence – Evaluation Functions and Numerical Convergence....541
Chapter 4 – Optimizing Your Solutions with K-Means Clustering....542
Chapter 5 – How to Use Decision Trees to Enhance K-Means Clustering....544
Chapter 6 – Innovating AI with Google Translate....545
Chapter 7 – Optimizing Blockchains with Naive Bayes....547
Chapter 8 – Solving the XOR Problem with a Feedforward Neural Network....548
Chapter 9 – Abstract Image Classification with Convolutional Neural Networks (CNNs)....550
Chapter 10 – Conceptual Representation Learning....551
Chapter 11 – Combining Reinforcement Learning and Deep Learning....553
Chapter 12 – AI and the Internet of Things....554
Chapter 13 – Visualizing Networks with TensorFlow 2.x and TensorBoard....556
Chapter 14 – Preparing the Input of Chatbots with Restricted Boltzmann Machines (RBMs) and Principal Component Analysis (PCA)....557
Chapter 15 – Setting Up a Cognitive NLP UI/CUI Chatbot....558
Chapter 16 – Improving the Emotional Intelligence Deficiencies of Chatbots....559
Chapter 17 – Genetic Algorithms in Hybrid Neural Networks....560
Chapter 18 – Neuromorphic Computing....561
Chapter 19 – Quantum Computing....563
Other Books You May Enjoy....566
Index....570
Understand the fundamentals and develop your own AI solutions in this updated edition packed with many new examples
AI has the potential to replicate human thinking in every field. Artificial Intelligence By Example, Second Edition serves as a starting point for understanding how AI is built, with the help of intriguing and exciting examples.
This book will make you an adaptive thinker and help you apply concepts to real-world scenarios. Working through some of the most interesting AI examples, from a simple chess engine to cognitive chatbots, you will learn how to tackle the machine you are competing with. You will study some of the most advanced machine learning models, understand how to apply AI to blockchain and the Internet of Things (IoT), and develop emotional intelligence in chatbots using neural networks such as recurrent neural networks (RNNs) and convolutional neural networks (CNNs).
This edition also includes new examples covering hybrid neural networks, combining reinforcement learning (RL) and deep learning (DL), chaining algorithms by pairing unsupervised learning with decision trees and random forests, combining DL with genetic algorithms, conversational user interfaces (CUIs) for chatbots, neuromorphic computing, and quantum computing.
By the end of this book, you will understand the fundamentals of AI and have worked through a number of examples that will help you develop your AI solutions.
This book is for developers and anyone interested in AI who wants to understand the fundamentals of artificial intelligence and implement them practically. Prior experience with Python programming and a working knowledge of statistics are essential to make the most of this book.