Fundamentals of Neural Networks – Architectures, Algorithms, and Applications
Laurene Fausett
Date: September 4, 2016
Contents
PREFACE
ACKNOWLEDGMENTS
CHAPTER 1 INTRODUCTION
1.1 Why Neural Networks, and Why Now? 1
1.2 What Is a Neural Net? 3
1.2.1 Artificial Neural Networks, 3
1.2.2 Biological Neural Networks, 5
1.3 Where Are Neural Nets Being Used? 7
1.3.1 Signal Processing, 7
1.3.2 Control, 8
1.3.3 Pattern Recognition, 8
1.3.4 Medicine, 9
1.3.5 Speech Production, 9
1.3.6 Speech Recognition, 10
1.3.7 Business, 11
1.4 How Are Neural Networks Used? 11
1.4.1 Typical Architectures, 12
1.4.2 Setting the Weights, 15
1.4.3 Common Activation Functions, 17
1.4.4 Summary of Notation, 20
1.5 Who Is Developing Neural Networks? 22
1.5.1 The 1940s: The Beginning of Neural Nets, 22
1.5.2 The 1950s and 1960s: The First Golden Age of Neural Networks, 23
1.5.3 The 1970s: The Quiet Years, 24
1.5.4 The 1980s: Renewed Enthusiasm, 25
1.6 When Neural Nets Began: The McCulloch-Pitts Neuron 26
1.6.1 Architecture, 27
1.6.2 Algorithm, 28
1.6.3 Applications, 30
1.7 Suggestions for Further Study 35
1.7.1 Readings, 35
1.7.2 Exercises, 37
CHAPTER 2 SIMPLE NEURAL NETS FOR PATTERN CLASSIFICATION 39
2.1 General Discussion 39
2.1.1 Architecture, 40
2.1.2 Biases and Thresholds, 41
2.1.3 Linear Separability, 43
2.1.4 Data Representation, 48
2.2 Hebb Net 48
2.2.1 Algorithm, 49
2.2.2 Application, 50
2.3 Perceptron 59
2.3.1 Architecture, 60
2.3.2 Algorithm, 61
2.3.3 Application, 62
2.3.4 Perceptron Learning Rule Convergence Theorem, 76
2.4 Adaline 80
2.4.1 Architecture, 81
2.4.2 Algorithm, 81
2.4.3 Applications, 82
2.4.4 Derivations, 86
2.4.5 Madaline, 88
2.5 Suggestions for Further Study 96
2.5.1 Readings, 96
2.5.2 Exercises, 97
2.5.3 Projects, 100
CHAPTER 3 PATTERN ASSOCIATION 101
3.1 Training Algorithms for Pattern Association 103
3.1.1 Hebb Rule for Pattern Association, 103
3.1.2 Delta Rule for Pattern Association, 106
3.2 Heteroassociative Memory Neural Network 108
3.2.1 Architecture, 108
3.2.2 Application, 108
3.3 Autoassociative Net 121
3.3.1 Architecture, 121
3.3.2 Algorithm, 122
3.3.3 Application, 122
3.3.4 Storage Capacity, 125
3.4 Iterative Autoassociative Net 129
3.4.1 Recurrent Linear Autoassociator, 130
3.4.2 Brain-State-in-a-Box, 131
3.4.3 Autoassociator With Threshold Function, 132
3.4.4 Discrete Hopfield Net, 135
3.5 Bidirectional Associative Memory (BAM) 140
3.5.1 Architecture, 141
3.5.2 Algorithm, 141
3.5.3 Application, 144
3.5.4 Analysis, 148
3.6 Suggestions for Further Study 149
3.6.1 Readings, 149
3.6.2 Exercises, 150
3.6.3 Projects, 152
CHAPTER 4 NEURAL NETWORKS BASED ON COMPETITION 156
4.1 Fixed-Weight Competitive Nets 158
4.1.1 Maxnet, 158
4.1.2 Mexican Hat, 160
4.1.3 Hamming Net, 164
4.2 Kohonen Self-Organizing Maps 169
4.2.1 Architecture, 169
4.2.2 Algorithm, 170
4.2.3 Application, 172
4.3 Learning Vector Quantization 187
4.3.1 Architecture, 187
4.3.2 Algorithm, 188
4.3.3 Application, 189
4.3.4 Variations, 192
4.4 Counterpropagation 195
4.4.1 Full Counterpropagation, 196
4.4.2 Forward-Only Counterpropagation, 206
4.5 Suggestions for Further Study 211
4.5.1 Readings, 211
4.5.2 Exercises, 211
4.5.3 Projects, 214
CHAPTER 5 ADAPTIVE RESONANCE THEORY 218
5.1 Introduction 218
5.1.1 Motivation, 218
5.1.2 Basic Architecture, 219
5.1.3 Basic Operation, 220
5.2 ART1 222
5.2.1 Architecture, 222
5.2.2 Algorithm, 225
5.2.3 Applications, 229
5.2.4 Analysis, 243
5.3 ART2 246
5.3.1 Architecture, 247
5.3.2 Algorithm, 250
5.3.3 Applications, 257
5.3.4 Analysis, 276
5.4 Suggestions for Further Study 283
5.4.1 Readings, 283
5.4.2 Exercises, 284
5.4.3 Projects, 287
CHAPTER 6 BACKPROPAGATION NEURAL NET 289
6.1 Standard Backpropagation 289
6.1.1 Architecture, 290
6.1.2 Algorithm, 290
6.1.3 Applications, 300
6.2 Variations 305
6.2.1 Alternative Weight Update Procedures, 305
6.2.2 Alternative Activation Functions, 309
6.2.3 Strictly Local Backpropagation, 316
6.2.4 Number of Hidden Layers, 320
6.3 Theoretical Results 324
6.3.1 Derivation of Learning Rules, 324
6.3.2 Multilayer Neural Nets as Universal Approximators, 328
6.4 Suggestions for Further Study 330
6.4.1 Readings, 330
6.4.2 Exercises, 330
6.4.3 Projects, 332
CHAPTER 7 A SAMPLER OF OTHER NEURAL NETS 334
7.1 Fixed Weight Nets for Constrained Optimization 335
7.1.1 Boltzmann Machine, 338
7.1.2 Continuous Hopfield Net, 348
7.1.3 Gaussian Machine, 357
7.1.4 Cauchy Machine, 359
7.2 A Few More Nets that Learn 362
7.2.1 Modified Hebbian Learning, 362
7.2.2 Boltzmann Machine with Learning, 367
7.2.3 Simple Recurrent Net, 372
7.2.4 Backpropagation in Time, 377
7.2.5 Backpropagation Training for Fully Recurrent Nets,
7.3 Adaptive Architectures 385
7.3.1 Probabilistic Neural Net, 385
7.3.2 Cascade Correlation, 390
7.4 Neocognitron 398
7.4.1 Architecture, 399
7.4.2 Algorithm, 407
7.5 Suggestions for Further Study 418
7.5.1 Readings, 418
7.5.2 Exercises, 418
7.5.3 Project, 420
GLOSSARY 422
REFERENCES 437
INDEX
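As a small taste of the material, the book opens with the McCulloch-Pitts neuron (section 1.6): a unit that fires when the weighted sum of its binary inputs reaches a fixed threshold. The sketch below is a minimal illustration of that idea; the weights and thresholds shown are illustrative assumptions, not the book's own worked examples.

```python
# Minimal sketch of a McCulloch-Pitts unit (illustrative, not from the book).

def mcp_neuron(inputs, weights, threshold):
    """Output 1 iff the weighted input sum meets the threshold, else 0."""
    net = sum(x * w for x, w in zip(inputs, weights))
    return 1 if net >= threshold else 0

# With weights (1, 1), threshold 2 realizes logical AND, threshold 1 realizes OR.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, mcp_neuron((x1, x2), (1, 1), 2))
```

Choosing different thresholds (or inhibitory, i.e. negative, weights) yields other logic functions, which is how chapter 1 motivates networks of such units.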