
Intelligent Systems Reference Library - Achim Zielesny - From Curve Fitting to Machine Learning, 2nd Edition [2016, PDF, ENG]

From Curve Fitting to Machine Learning, 2nd Edition
Year of publication: 2016
Author: Achim Zielesny
Publisher: Springer
ISBN: 978-3319325446
Series: Intelligent Systems Reference Library
Language: English
Format: PDF
Quality: Publisher's layout or text (eBook)
Interactive table of contents: Yes
Number of pages: 493
Description: In its second edition, this successful book provides an interactive and illustrative guide from two-dimensional curve fitting to multidimensional clustering and machine learning with neural networks or support vector machines. Along the way, topics like mathematical optimization and evolutionary algorithms are touched on. All concepts and ideas are outlined in a clear-cut manner, with graphically depicted plausibility arguments and a little elementary mathematics.
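To give a flavor of the book's starting point, the two-dimensional curve fit, here is a minimal sketch in Mathematica using the built-in NonlinearModelFit. The book itself works with the CIP library, whose function names are not reproduced here; the data and the exponential model below are invented purely for illustration.

(* Synthetic noisy observations of y = 2 Exp[-0.5 x] *)
SeedRandom[1];
data = Table[{x, 2 Exp[-0.5 x] + RandomReal[{-0.05, 0.05}]}, {x, 0, 5, 0.25}];

(* Fit a two-parameter exponential model function to the data *)
fit = NonlinearModelFit[data, a Exp[-b x], {{a, 1}, {b, 1}}, x];

(* Inspect the fitted parameters and a goodness-of-fit measure *)
fit["BestFitParameters"]
fit["RSquared"]

The start values {a, 1} and {b, 1} are not incidental: the pitfalls of choosing parameters' start values are exactly what Sections 2.4.1 and 2.4.2 of the book address.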
The major topics are extensively outlined with exploratory examples and applications. The primary goal is to be as illustrative as possible, not hiding problems and pitfalls but addressing them. The character of an illustrative cookbook is complemented with specific sections that address more fundamental questions, like the relation between machine learning and human intelligence.
All topics are fully demonstrated with the computing platform Mathematica and the Computational Intelligence Packages (CIP), a high-level function library developed in Mathematica's programming language on top of Mathematica's algorithms. CIP is open source, and the detailed code used throughout the book is freely accessible.
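Since CIP is built on top of Mathematica's own algorithms, the clustering and classification themes of Chapters 3 and 4 can be previewed with the built-ins alone. Here is a minimal sketch on the iris flower data used in the book, assuming the FisherIris set in Mathematica's ExampleData is available in your version:

(* Fisher iris data: four measurements plus a species label per row *)
iris = ExampleData[{"Statistics", "FisherIris"}];
features = iris[[All, 1 ;; 4]];

(* Unsupervised: partition the 150 flowers into three clusters *)
clusters = FindClusters[features, 3];
Length /@ clusters  (* cluster occupancies, cf. Section 3.5 *)

(* Supervised: learn a classifier from measurements to species *)
cf = Classify[Thread[features -> iris[[All, 5]]]];
cf[{5.1, 3.5, 1.4, 0.2}]  (* predicted species for one flower *)

This is only a stand-in: CIP wraps such steps in a uniform high-level interface, and the book's freely accessible code shows the actual CIP functions.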
The target readerships are students of (computer) science and engineering as well as scientific practitioners in industry and academia who deserve an illustrative introduction. Readers with programming skills may easily port or customize the provided code.
"'From curve fitting to machine learning' is ... a useful book. ... It contains the basic formulas of curve fitting and related subjects and throws in, what is missing in so many books, the code to reproduce the results. All in all this is an interesting and useful book both for novice as well as expert readers. For the novice it is a good introductory book and the expert will appreciate the many examples and working code." (Leslie A. Piegl, review of the first edition, 2012)


Contents
1 Introduction
1.1 Motivation: Data, models and molecular sciences
1.2 Optimization
1.2.1 Calculus
1.2.2 Iterative optimization
1.2.3 Iterative local optimization
1.2.4 Iterative global optimization
1.2.5 Constrained iterative optimization
1.3 Model functions
1.3.1 Linear model functions with one argument
1.3.2 Non-linear model functions with one argument
1.3.3 Linear model functions with multiple arguments
1.3.4 Non-linear model functions with multiple arguments
1.3.5 Multiple model functions
1.3.6 Summary
1.4 Data structures
1.4.1 Data for curve fitting
1.4.2 Data for machine learning
1.4.3 Inputs for clustering
1.4.4 Inspection, cleaning and splitting of data
1.5 Scaling of data
1.6 Data errors
1.7 Regression versus classification tasks
1.8 The structure of CIP calculations
1.9 A note on reproducibility
2 Curve Fitting
2.1 Basics
2.1.1 Fitting data
2.1.2 Useful quantities
2.1.3 Smoothing data
2.2 Evaluating the goodness of fit
2.3 How to guess a model function
2.4 Problems and pitfalls
2.4.1 Parameters' start values
2.4.2 How to search for parameters' start values
2.4.3 More difficult curve fitting problems
2.4.4 Inappropriate model functions
2.5 Parameters' errors
2.5.1 Correction of parameters' errors
2.5.2 Confidence levels of parameters' errors
2.5.3 Estimating the necessary number of data
2.5.4 Large parameters' errors and educated cheating
2.5.5 Experimental errors and data transformation
2.6 Empirical enhancement of theoretical model functions
2.7 Data smoothing with cubic splines
2.8 Cookbook recipes for curve fitting
3 Clustering
3.1 Basics
3.2 Intuitive clustering
3.3 Clustering with a fixed number of clusters
3.4 Getting representatives
3.5 Cluster occupancies and the iris flower example
3.6 White-spot analysis
3.7 Alternative clustering with ART-2a
3.8 Clustering and class predictions
3.9 Cookbook recipes for clustering
4 Machine Learning
4.1 Basics
4.2 Machine learning methods
4.2.1 Multiple linear and polynomial regression (MLR, MPR)
4.2.2 Three-layer feed-forward neural networks
4.2.3 Support vector machines (SVM)
4.3 Evaluating the goodness of regression
4.4 Evaluating the goodness of classification
4.5 Regression: Entering non-linearity
4.6 Classification: Non-linear decision surfaces
4.7 Ambiguous classification
4.8 Training and test set partitioning
4.8.1 Cluster representatives based selection
4.8.2 Iris flower classification revisited
4.8.3 Adhesive kinetics regression revisited
4.8.4 Design of experiment
4.8.5 Concluding remarks
4.9 Comparative machine learning
4.10 Relevance of input components and minimal models
4.11 Pattern recognition
4.12 Technical optimization problems
4.13 Cookbook recipes for machine learning
4.14 Appendix - Collecting the pieces
5 Discussion
5.1 Computers are about speed
5.2 Isn't it just ...?
5.2.1 ... optimization?
5.2.2 ... data smoothing?
5.3 Computational intelligence
5.4 Final remark
A CIP - Computational Intelligence Packages
A.1 Basics
A.2 Experimental data
A.2.1 Temperature dependence of the viscosity of water
A.2.2 Potential energy surface of hydrogen fluoride
A.2.3 Kinetics data from time dependent IR spectra of the hydrolysis of acetanhydride
A.2.4 Iris flowers
A.2.5 Adhesive kinetics
A.2.6 Intertwined spirals
A.2.7 Faces
A.2.8 Wisconsin Diagnostic Breast Cancer (WDBC) data
A.2.9 Wisconsin Prognostic Breast Cancer (WPBC) data
A.2.10 QSPR data
A.3 Parallelized calculations
References
Index