Machine Learning Books
O'Reilly Media Low-Code AI
Book Synopsis: This hands-on guide presents three problem-focused ways to learn ML: no-code using AutoML, low-code using BigQuery ML, and custom code using scikit-learn and Keras. You'll learn key ML concepts by using real-world datasets with realistic problems.
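To give a flavour of the custom-code track, here is a minimal scikit-learn sketch in the spirit of the book (the synthetic dataset and model choice are illustrative, not an example taken from the book):

# Minimal sketch of the "custom code" path with scikit-learn, on a synthetic
# tabular dataset (dataset and model choice are illustrative, not the book's).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
print(f"Test accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")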
£47.99
John Wiley & Sons Inc A Data Scientist's Guide to Acquiring, Cleaning, and Managing Data in R
Book Synopsis: The only how-to guide offering a unified, systematic approach to acquiring, cleaning, and managing data in R. Every experienced practitioner knows that preparing data for modeling is a painstaking, time-consuming process. Adding to the difficulty is that most modelers learn the steps involved in cleaning and managing data piecemeal, often on the fly, or they develop their own ad hoc methods. This book helps simplify their task by providing a unified, systematic approach to acquiring, modeling, manipulating, cleaning, and maintaining data in R. Starting with the very basics, data scientists Samuel E. Buttrey and Lyn R. Whitaker walk readers through the entire process. From what data looks like and what it should look like, they progress through all the steps involved in getting data ready for modeling. They describe best practices for acquiring data from numerous sources and explore key issues in data handling, including text/regular expressions, big data, and parallel processing.
Table of Contents:
About the Authors; Preface; Acknowledgments; About the Companion Website
1 R: 1.1 Introduction (What Is R?; Who Uses R and Why?; Acquiring and Installing R; Starting and Quitting R); 1.2 Data (Acquiring Data; Cleaning Data; The Goal of Data Cleaning; Making Your Work Reproducible); 1.3 The Very Basics of R (Top Ten Quick Facts You Need to Know about R; Vocabulary; Calculating and Printing in R); 1.4 Running an R Session (Where Your Data Is Stored; Options; Scripts; R Packages; RStudio and Other GUIs; Locales and Character Sets); 1.5 Getting Help (At the Command Line; The Online Manuals; On the Internet; Further Reading); 1.6 How to Use This Book (Syntax and Conventions in This Book; The Chapters)
2 R Data, Part 1: Vectors: 2.1 Vectors (Creating Vectors; Sequences; Logical Vectors; Vector Operations; Names); 2.2 Data Types (Some Less-Common Data Types; What Type of Vector Is This?; Converting from One Type to Another); 2.3 Subsets of Vectors (Extracting; Vectors of Length 0; Assigning or Replacing Elements of a Vector); 2.4 Missing Data (NA) and Other Special Values (The Effect of NAs in Expressions; Identifying and Removing or Replacing NAs; Indexing with NAs; NaN and Inf Values; NULL Values); 2.5 The table() Function (Two- and Higher-Way Tables; Operating on Elements of a Table); 2.6 Other Actions on Vectors (Rounding; Sorting and Ordering; Vectors as Sets; Identifying Duplicates and Matching; Finding Runs of Duplicate Values); 2.7 Long Vectors and Big Data; 2.8 Chapter Summary and Critical Data Handling Tools
3 R Data, Part 2: More Complicated Structures: 3.1 Introduction; 3.2 Matrices (Extracting and Assigning; Row and Column Names; Applying a Function to Rows or Columns; Missing Values in Matrices; Using a Matrix Subscript; Sparse Matrices; Three- and Higher-Way Arrays); 3.3 Lists (Extracting and Assigning; Lists in Practice); 3.4 Data Frames (Missing Values in Data Frames; Extracting and Assigning in Data Frames; Extracting Things That Aren't There); 3.5 Operating on Lists and Data Frames (Split, Apply, Combine; All-Numeric Data Frames; Convenience Functions; Re-Ordering, De-Duplicating, and Sampling from Data Frames); 3.6 Date and Time Objects (Formatting Dates; Common Operations on Date Objects; Differences between Dates; Dates and Times; Creating POSIXt Objects; Mathematical Functions for Dates and Times; Missing Values in Dates; Using Apply Functions with Dates and Times); 3.7 Other Actions on Data Frames (Combining by Rows or Columns; Merging Data Frames; Comparing Two Data Frames; Viewing and Editing Data Frames Interactively); 3.8 Handling Big Data; 3.9 Chapter Summary and Critical Data Handling Tools
4 R Data, Part 3: Text and Factors: 4.1 Character Data (The length() and nchar() Functions; Tab, New-Line, Quote, and Backslash Characters; The Empty String; Substrings; Changing Case and Other Substitutions); 4.2 Converting Numbers into Text (Scientific Notation; Discretizing a Numeric Variable); 4.3 Constructing Character Strings: Paste in Action (Constructing Column Names; Tabulating Dates by Year and Month or Quarter Labels; Constructing Unique Keys; Constructing File and Path Names); 4.4 Regular Expressions (Types of Regular Expressions; Tools for Regular Expressions in R; Special Characters in Regular Expressions; Examples; The regexpr() Function and Its Variants; Using Regular Expressions in Replacement; Splitting Strings at Regular Expressions; Regular Expressions versus Wildcard Matching; Common Data Cleaning Tasks Using Regular Expressions; Documenting and Debugging Regular Expressions); 4.5 UTF-8 and Other Non-ASCII Characters (Extended ASCII for Latin Alphabets; Non-Latin Alphabets; Character and String Encoding in R); 4.6 Factors (What Is a Factor?; Factor Levels; Converting and Combining Factors; Missing Values in Factors; Factors in Data Frames); 4.7 R Object Names and Commands as Text (R Object Names as Text; R Commands as Text); 4.8 Chapter Summary and Critical Data Handling Tools
5 Writing Functions and Scripts: 5.1 Functions (Function Arguments; Global versus Local Variables; Return Values; Creating and Editing Functions); 5.2 Scripts and Shell Scripts (Line-by-Line Parsing); 5.3 Error Handling and Debugging (Debugging Functions; Issuing Error and Warning Messages; Catching and Processing Errors); 5.4 Interacting with the Operating System (File and Directory Handling; Environment Variables); 5.5 Speeding Things Up (Profiling; Vectorizing Functions; Other Techniques to Speed Things Up); 5.6 Chapter Summary and Critical Data Handling Tools (Programming Style; Common Bugs; Objects, Classes, and Methods)
6 Getting Data into and out of R: 6.1 Reading Tabular ASCII Data into Data Frames (Files with Delimiters; Column Classes; Common Pitfalls in Reading Tables; An Example of When read.table() Fails; Other Uses of the scan() Function; Writing Delimited Files; Reading and Writing Fixed-Width Files; A Note on End-of-Line Characters); 6.2 Reading Large, Non-Tabular, or Non-ASCII Data (Opening and Closing Files; Reading and Writing Lines; Reading and Writing UTF-8 and Other Encodings; The Null Character; Binary Data; Reading Problem Files in Action); 6.3 Reading Data from Relational Databases (Connecting to the Database Server; Introduction to SQL); 6.4 Handling Large Numbers of Input Files; 6.5 Other Formats (Using the Clipboard; Reading Data from Spreadsheets; Reading Data from the Web; Reading Data from Other Statistical Packages); 6.6 Reading and Writing R Data Directly; 6.7 Chapter Summary and Critical Data Handling Tools
7 Data Handling in Practice: 7.1 Acquiring and Reading Data; 7.2 Cleaning Data; 7.3 Combining Data (Combining by Row; Combining by Column; Merging by Key); 7.4 Transactional Data (Example of Transactional Data; Combining Tabular and Transactional Data); 7.5 Preparing Data; 7.6 Documentation and Reproducibility; 7.7 The Role of Judgment; 7.8 Data Cleaning in Action (Reading and Cleaning BedBath1.csv; Reading and Cleaning BedBath2.csv; Combining the BedBath Data Frames; Reading and Cleaning EnergyUsage.csv; Merging the BedBath and EnergyUsage Data Frames); 7.9 Chapter Summary and Critical Data Handling Tools
8 Extended Exercise: 8.1 Introduction to the Problem (The Goal; Modeling Considerations; Examples of Things to Check); 8.2 The Data; 8.3 Five Important Fields; 8.4 Loan and Application Portfolios (Layout of the Beachside Lenders Data; Layout of the Wilson and Sons Data; Combining the Two Portfolios); 8.5 Scores (Scores Layout); 8.6 Co-borrower Scores (Co-borrower Score Examples); 8.7 Updated KScores (Updated KScores Layout); 8.8 Loans to Be Excluded (Sample Exclusion File); 8.9 Response Variable; 8.10 Assembling the Final Data Sets (Final Data Layout; Concluding Remarks)
Appendix A Hints and Pseudocode: A.1 Loan Portfolios (Things to Check); A.2 Scores Database (Things to Check); A.3 Co-borrower Scores (Things to Check); A.4 Updated KScores (Things to Check); A.5 Excluder Files (Things to Check); A.6 Payment Matrix (Things to Check); A.7 Starting the Modeling Process
Bibliography; Index
£49.46
John Wiley & Sons Inc The Big R-Book
Book Synopsis: Introduces professionals and scientists to statistics and machine learning using the programming language R. Written by and for practitioners, this book provides an overall introduction to R, focusing on tools and methods commonly used in data science, and placing emphasis on practice and business use. It covers a wide range of topics in a single volume, including big data, databases, statistical machine learning, data wrangling, data visualization, and the reporting of results. The topics covered are all important for someone with a science/math background who is looking to quickly learn several practical technologies to enter or transition to the growing field of data science. The Big R-Book for Professionals: From Data Science to Learning Machines and Reporting with R includes nine parts, starting with an introduction to the subject and followed by an overview of R and elements of statistics. The third part revolves around data import, while the fourth focuses on data wrangling.
Table of Contents:
Foreword; About the Author; Acknowledgements; Preface; About the Companion Site
Part I Introduction:
1 The Big Picture with Kondratiev and Kardashev
2 The Scientific Method and Data
3 Conventions
Part II Starting with R and Elements of Statistics:
4 The Basics of R: 4.1 Getting Started with R; 4.2 Variables; 4.3 Data Types (The Elementary Types; Vectors; Accessing Data from a Vector; Matrices; Arrays; Lists; Factors; Data Frames; Strings or the Character-type); 4.4 Operators (Arithmetic Operators; Relational Operators; Logical Operators; Assignment Operators; Other Operators); 4.5 Flow Control Statements (Choices; Loops); 4.6 Functions (Built-in Functions; Help with Functions; User-defined Functions; Changing Functions; Creating Functions with Default Arguments); 4.7 Packages (Discovering Packages in R; Managing Packages in R); 4.8 Selected Data Interfaces (CSV Files; Excel Files; Databases)
5 Lexical Scoping and Environments: 5.1 Environments in R; 5.2 Lexical Scoping in R
6 The Implementation of OO: 6.1 Base Types; 6.2 S3 Objects (Creating S3 Objects; Creating Generic Methods; Method Dispatch; Group Generic Functions); 6.3 S4 Objects (Creating S4 Objects; Using S4 Objects; Validation of Input; Constructor Functions; The Data Slot; Recognising Objects, Generic Functions, and Methods; Creating S4 Generics; Method Dispatch); 6.4 The Reference Class (refclass, RC, or R5) Model (Creating RC Objects; Important Methods and Attributes); 6.5 Conclusions about the OO Implementation
7 Tidy R with the Tidyverse: 7.1 The Philosophy of the Tidyverse; 7.2 Packages in the Tidyverse (The Core Tidyverse; The Non-core Tidyverse); 7.3 Working with the Tidyverse (Tibbles; Piping with R; Attention Points When Using the Pipe; Advanced Piping; Conclusion)
8 Elements of Descriptive Statistics: 8.1 Measures of Central Tendency (Mean; The Median; The Mode); 8.2 Measures of Variation or Spread; 8.3 Measures of Covariation (The Pearson Correlation; The Spearman Correlation; Chi-square Tests); 8.4 Distributions (Normal Distribution; Binomial Distribution); 8.5 Creating an Overview of Data Characteristics
9 Visualisation Methods: 9.1 Scatterplots; 9.2 Line Graphs; 9.3 Pie Charts; 9.4 Bar Charts; 9.5 Boxplots; 9.6 Violin Plots; 9.7 Histograms; 9.8 Plotting Functions; 9.9 Maps and Contour Plots; 9.10 Heat-maps; 9.11 Text Mining (Word Clouds; Word Associations); 9.12 Colours in R
10 Time Series Analysis: 10.1 Time Series in R (The Basics of Time Series in R); 10.2 Forecasting (Moving Average; Seasonal Decomposition)
11 Further Reading
Part III Data Import:
12 A Short History of Modern Database Systems
13 RDBMS
14 SQL: 14.1 Designing the Database; 14.2 Building the Database Structure (Installing an RDBMS; Creating the Database; Creating the Tables and Relations); 14.3 Adding Data to the Database; 14.4 Querying the Database (The Basic Select Query; More Complex Queries); 14.5 Modifying the Database Structure; 14.6 Selected Features of SQL (Changing Data; Functions in SQL)
15 Connecting R to an SQL Database
Part IV Data Wrangling:
16 Anonymous Data
17 Data Wrangling in the tidyverse: 17.1 Importing the Data (Importing from an SQL RDBMS; Importing Flat Files in the Tidyverse); 17.2 Tidy Data; 17.3 Tidying Up Data with tidyr (Splitting Tables; Convert Headers to Data; Spreading One Column Over Many; Split One Column into Many; Merge Multiple Columns into One; Wrong Data); 17.4 SQL-like Functionality via dplyr (Selecting Columns; Filtering Rows; Joining; Mutating Data; Set Operations); 17.5 String Manipulation in the tidyverse (Basic String Manipulation; Pattern Matching with Regular Expressions); 17.6 Dates with lubridate (ISO 8601 Format; Time-zones; Extract Date and Time Components; Calculating with Date-times); 17.7 Factors with forcats
18 Dealing with Missing Data: 18.1 Reasons for Data to Be Missing; 18.2 Methods to Handle Missing Data (Alternative Solutions to Missing Data; Predictive Mean Matching (PMM)); 18.3 R Packages to Deal with Missing Data (mice; missForest; Hmisc)
19 Data Binning: 19.1 What Is Binning and Why Use It; 19.2 Tuning the Binning Procedure; 19.3 More Complex Cases: Matrix Binning; 19.4 Weight of Evidence and Information Value (Weight of Evidence (WOE); Information Value (IV); WOE and IV in R)
20 Factor Analysis and Principal Components: 20.1 Principal Component Analysis (PCA); 20.2 Factor Analysis
Part V Modelling:
21 Regression Models: 21.1 Linear Regression; 21.2 Multiple Linear Regression (Poisson Regression; Non-linear Regression); 21.3 Performance of Regression Models (Mean Square Error (MSE); R-Squared; Mean Average Deviation (MAD))
22 Classification Models: 22.1 Logistic Regression; 22.2 Performance of Binary Classification Models (The Confusion Matrix and Related Measures; ROC; The AUC; The Gini Coefficient; Kolmogorov-Smirnov (KS) for Logistic Regression; Finding an Optimal Cut-off)
23 Learning Machines: 23.1 Decision Tree (Essential Background; Important Considerations; Growing Trees with the Package rpart; Evaluating the Performance of a Decision Tree); 23.2 Random Forest; 23.3 Artificial Neural Networks (ANNs) (The Basics of ANNs in R; Neural Networks in R; The Work-flow for Fitting a NN; Cross-Validate the NN); 23.4 Support Vector Machine (Fitting an SVM in R; Optimizing the SVM); 23.5 Unsupervised Learning and Clustering (k-Means Clustering; Visualizing Clusters in Three Dimensions; Fuzzy Clustering; Hierarchical Clustering; Other Clustering Methods)
24 Towards a Tidy Modelling Cycle with modelr: 24.1 Adding Predictions; 24.2 Adding Residuals; 24.3 Bootstrapping Data; 24.4 Other Functions of modelr
25 Model Validation: 25.1 Model Quality Measures; 25.2 Predictions and Residuals; 25.3 Bootstrapping (Bootstrapping in Base R; Bootstrapping in the tidyverse with modelr); 25.4 Cross-Validation (Elementary Cross-Validation; Monte Carlo Cross-Validation; k-Fold Cross-Validation; Comparing Cross-Validation Methods); 25.5 Validation in a Broader Perspective
26 Labs: 26.1 Financial Analysis with quantmod (The Basics of quantmod; Types of Data Available in quantmod; Plotting with quantmod; The quantmod Data Structure; Support Functions Supplied by quantmod; Financial Modelling in quantmod)
27 Multi-Criteria Decision Analysis (MCDA): 27.1 What and Why; 27.2 General Work-flow; 27.3 Identify the Issue at Hand: Steps 1 and 2; 27.4 Step 3: The Decision Matrix (Construct a Decision Matrix; Normalize the Decision Matrix); 27.5 Step 4: Delete Inefficient and Unacceptable Alternatives (Unacceptable Alternatives; Dominance – Inefficient Alternatives); 27.6 Plotting Preference Relationships; 27.7 Step 5: MCDA Methods (Examples of Non-compensatory Methods; The Weighted Sum Method (WSM); Weighted Product Method (WPM); ELECTRE; PROMETHEE; PCA (Gaia); Outranking Methods; Goal Programming); 27.8 Summary MCDA
Part VI Introduction to Companies:
28 Financial Accounting (FA): 28.1 The Statements of Accounts (Income Statement; Net Income: The P&L Statement; Balance Sheet); 28.2 The Value Chain; 28.3 Further Terminology; 28.4 Selected Financial Ratios
29 Management Accounting: 29.1 Introduction (Definition of Management Accounting (MA); Management Information Systems (MIS)); 29.2 Selected Methods in MA (Cost Accounting; Selected Cost Types); 29.3 Selected Use Cases of MA (Balanced Scorecard; Key Performance Indicators (KPIs))
30 Asset Valuation Basics: 30.1 Time Value of Money (Interest Basics; Specific Interest Rate Concepts; Discounting); 30.2 Cash; 30.3 Bonds (Features of a Bond; Valuation of Bonds; Duration); 30.4 The Capital Asset Pricing Model (CAPM) (The CAPM Framework; The CAPM and Risk; Limitations and Shortcomings of the CAPM); 30.5 Equities (Definition; Short History; Valuation of Equities; Absolute Value Models; Relative Value Models; Selection of Valuation Methods; Pitfalls in Company Valuation); 30.6 Forwards and Futures; 30.7 Options (Definitions; Commercial Aspects; Short History; Valuation of Options at Maturity; The Black and Scholes Model; The Binomial Model; Dependencies of the Option Price; The Greeks; Delta Hedging; Linear Option Strategies; Integrated Option Strategies; Exotic Options; Capital Protected Structures)
Part VII Reporting:
31 A Grammar of Graphics with ggplot2: 31.1 The Basics of ggplot2; 31.2 Over-plotting; 31.3 Case Study for ggplot2
32 R Markdown
33 knitr and LaTeX
34 An Automated Development Cycle
35 Writing and Communication Skills
36 Interactive Apps: 36.1 Shiny; 36.2 Browser-Born Data Visualization (HTML-widgets; Interactive Maps with leaflet; Interactive Data Visualisation with ggvis; googleVis); 36.3 Dashboards (The Business Case: a Diversity Dashboard; A Dashboard with flexdashboard; A Dashboard with shinydashboard)
Part VIII Bigger and Faster R:
37 Parallel Computing: 37.1 Combine foreach and doParallel; 37.2 Distribute Calculations over LAN with Snow; 37.3 Using the GPU (Getting Started with gpuR; On the Importance of Memory Use; Conclusions for GPU Programming)
38 R and Big Data: 38.1 Use a Powerful Server (Use R on a Server; Let the Database Server Do the Heavy Lifting); 38.2 Using More Memory than We Have RAM
39 Parallelism for Big Data: 39.1 Apache Hadoop; 39.2 Apache Spark (Installing Spark; Running Spark; SparkR; sparklyr; SparkR or sparklyr)
40 The Need for Speed: 40.1 Benchmarking; 40.2 Optimize Code (Avoid Repeating the Same; Use Vectorisation Where Appropriate; Pre-allocating Memory; Use the Fastest Function; Use the Fastest Package; Be Mindful about Details; Compile Functions; Use C or C++ Code in R; Using a C++ Source File in R; Call Compiled C++ Functions in R); 40.3 Profiling Code (The Package profr; The Package proftools); 40.4 Optimize Your Computer
Part IX Appendices:
A Create Your Own R Package: A.1 Creating the Package in the R Console; A.2 Update the Package Description; A.3 Documenting the Functions; A.4 Loading the Package; A.5 Further Steps
B Levels of Measurement: B.1 Nominal Scale; B.2 Ordinal Scale; B.3 Interval Scale; B.4 Ratio Scale
C Trademark Notices: C.1 General Trademark Notices; C.2 R-Related Notices (Crediting Developers of R Packages; The R Packages Used in This Book)
D Code Not Shown in the Body of the Book
E Answers to Selected Questions
Bibliography; Nomenclature; Index
£93.56
John Wiley & Sons Inc AWS Certified Machine Learning Study Guide
Table of Contents:
Introduction; Assessment Test; Answers to Assessment Test
Part I Introduction:
Chapter 1 AWS AI ML Stack: Amazon Rekognition (Image and Video Operations); Amazon Textract (Sync and Async APIs); Amazon Transcribe (Transcribe Features; Transcribe Medical); Amazon Translate (Amazon Translate Features); Amazon Polly; Amazon Lex (Lex Concepts); Amazon Kendra (How Kendra Works); Amazon Personalize; Amazon Forecast (Forecasting Metrics); Amazon Comprehend; Amazon CodeGuru; Amazon Augmented AI; Amazon SageMaker (Analyzing and Preprocessing Data; Training; Model Inference); AWS Machine Learning Devices; Summary; Exam Essentials; Review Questions
Chapter 2 Supporting Services from the AWS Stack: Storage (Amazon S3; Amazon EFS; Amazon FSx for Lustre; Data Versioning); Amazon VPC; AWS Lambda; AWS Step Functions; AWS RoboMaker; Summary; Exam Essentials; Review Questions
Part II Phases of Machine Learning Workloads:
Chapter 3 Business Understanding: Phases of ML Workloads; Business Problem Identification; Summary; Exam Essentials; Review Questions
Chapter 4 Framing a Machine Learning Problem: ML Problem Framing; Recommended Practices; Summary; Exam Essentials; Review Questions
Chapter 5 Data Collection: Basic Data Concepts; Data Repositories; Data Migration to AWS; Batch Data Collection; Streaming Data Collection; Summary; Exam Essentials; Review Questions
Chapter 6 Data Preparation: Data Preparation Tools (SageMaker Ground Truth; Amazon EMR; Amazon SageMaker Processing; AWS Glue; Amazon Athena; Redshift Spectrum); Summary; Exam Essentials; Review Questions
Chapter 7 Feature Engineering: Feature Engineering Concepts; Feature Engineering for Tabular Data; Feature Engineering for Unstructured and Time Series Data; Feature Engineering Tools on AWS; Summary; Exam Essentials; Review Questions
Chapter 8 Model Training: Common ML Algorithms (Supervised Machine Learning; Textual Data; Image Analysis; Unsupervised Machine Learning; Reinforcement Learning); Local Training and Testing; Remote Training; Distributed Training; Monitoring Training Jobs (Amazon CloudWatch; AWS CloudTrail; Amazon EventBridge); Debugging Training Jobs; Hyperparameter Optimization; Summary; Exam Essentials; Review Questions
Chapter 9 Model Evaluation: Experiment Management; Metrics and Visualization; Metrics in AWS AI/ML Services; Summary; Exam Essentials; Review Questions
Chapter 10 Model Deployment and Inference: Deployment for AI Services; Deployment for Amazon SageMaker (SageMaker Hosting: Under the Hood); Advanced Deployment Topics (Autoscaling Endpoints; Deployment Strategies; Testing Strategies); Summary; Exam Essentials; Review Questions
Chapter 11 Application Integration: Integration with On-Premises Systems; Integration with Cloud Systems; Integration with Front-End Systems; Summary; Exam Essentials; Review Questions
Part III Machine Learning Well-Architected Lens:
Chapter 12 Operational Excellence Pillar for ML: Operational Excellence on AWS; Everything as Code; Continuous Integration and Continuous Delivery; Continuous Monitoring; Continuous Improvement; Summary; Exam Essentials; Review Questions
Chapter 13 Security Pillar: Security and AWS (Data Protection; Isolation of Compute; Fine-Grained Access Controls; Audit and Logging; Compliance Scope); Secure SageMaker Environments (Authentication and Authorization; Data Protection; Network Isolation; Logging and Monitoring; Compliance Scope); AI Services Security; Summary; Exam Essentials; Review Questions
Chapter 14 Reliability Pillar: Reliability on AWS; Change Management for ML; Failure Management for ML; Summary; Exam Essentials; Review Questions
Chapter 15 Performance Efficiency Pillar for ML: Performance Efficiency for ML on AWS; Selection; Review; Monitoring; Trade-offs; Summary; Exam Essentials; Review Questions
Chapter 16 Cost Optimization Pillar for ML: Common Design Principles; Cost Optimization for ML Workloads (Design Principles; Common Cost Optimization Strategies); Summary; Exam Essentials; Review Questions
Chapter 17 Recent Updates in the AWS AI/ML Stack: New Services and Features Related to AI Services (New Services; New Features of Existing Services); New Features Related to Amazon SageMaker (Amazon SageMaker Studio; Amazon SageMaker Data Wrangler; Amazon SageMaker Feature Store; Amazon SageMaker Clarify; Amazon SageMaker Autopilot; Amazon SageMaker JumpStart; Amazon SageMaker Debugger; Amazon SageMaker Distributed Training Libraries; Amazon SageMaker Pipelines and Projects; Amazon SageMaker Model Monitor; Amazon SageMaker Edge Manager; Amazon SageMaker Asynchronous Inference); Summary; Exam Essentials
Appendix Answers to the Review Questions (Chapters 1–16); Index
£35.62
John Wiley & Sons Inc Not with a Bug, But with a Sticker
Table of Contents:
Foreword; Introduction
Chapter 1: Do You Want to Be Part of the Future? (Business at the Speed of AI; Follow Me, Follow Me; In AI, We Overtrust; Area 52 Ramblings; I'll Do It; Adversarial Attacks Are Happening; ML Systems Don't Jiggle-Jiggle; They Fold; Never Tell Me the Odds; AI's Achilles' Heel)
Chapter 2: Salt, Tape, and Split-Second Phantoms (Challenge Accepted; When Expectation Meets Reality; Color Me Blind; Translation Fails; Attacking AI Systems via Fails; Autonomous Trap 001; Common Corruption)
Chapter 3: Subtle, Specific, and Ever-Present (Intriguing Properties of Neural Networks; They Are Everywhere; Research Disciplines Collide; Blame Canada; The Intelligent Wiggle-Jiggle; Bargain-Bin Models Will Do; For Whom the Adversarial Example Bell Tolls)
Chapter 4: Here's Something I Found on the Web (Bad Data = Big Problem; Your AI Is Powered by Ghost Workers; Your AI Is Powered by Vampire Novels; Don't Believe Everything You Read on the Internet; Poisoning the Well; The Higher You Climb, the Harder You Fall)
Chapter 5: Can You Keep a Secret? (Why Is Defending Against Adversarial Attacks Hard?; Masking Is Important; Because It Is Possible; Masking Alone Is Not Good Enough; An Average Concerned Citizen; Security by Obscurity Has Limited Benefit; The Opportunity Is Great; the Threat Is Real; the Approach Must Be Bold; Swiss Cheese)
Chapter 6: Sailing for Adventure on the Deep Blue Sea (Why Be Securin' AI Systems So Blasted Hard? An Economics Perspective, Me Hearties!; Tis a Sign, Me Mateys; Here Be the Most Crucial AI Law Ye've Nary Heard Tell Of!; Lies, Accursed Lies, and Explanations!; No Free Grub; Whatcha Measure Be Whatcha Get!; Who Be Reapin' the Benefits?; Cargo Cult Science)
Chapter 7: The Big One (This Looks Futuristic; By All Means, Move at a Glacial Pace; You Know How That Thrills Me; Waiting for the Big One; Software, All the Way Down; The Aftermath; Race to AI Safety; Happy Story; In Medias Res; Big-Picture Questions)
Acknowledgments; Index
£18.69
Apress Data-Driven SEO with Python
Book Synopsis: Solve SEO problems using data science. This hands-on book is packed with Python code and data science techniques to help you generate data-driven recommendations and automate the SEO workload. This book is a practical, modern introduction to data science in the SEO context using Python. With social media, mobile, changing search engine algorithms, and ever-increasing expectations of users for super web experiences, too much data is generated for an SEO professional to make sense of in spreadsheets. For any modern-day SEO professional to succeed, it is relevant to find an alternate solution, and data science equips SEOs to grasp the issue at hand and solve it. From machine learning to Natural Language Processing (NLP) techniques, Data-Driven SEO with Python provides tried and tested techniques with full explanations for solving both everyday and complex SEO problems. This book is ideal for SEO professionals who want to take their industry skills to the next level.
Table of Contents:
Chapter 1: Meeting the Challenges of SEO with Data (1.1 Agents of change in SEO; 1.2 The Pillars of SEO Strategy; 1.3 Installing Python; 1.4 Using Python for SEO)
Chapter 2: Keyword Research (2.1 Data Sources; 2.2 Google Search Console; 2.4 Google Trends; 2.5 Google Suggest; 2.6 Competitor Analytics; 2.7 SERPs)
Chapter 3: Technical (3.1 Improving CTRs; 3.2 Allocate keywords to pages based on the copy; 3.3 Allocating parent nodes to the orphaned URLs; 3.4 Improve interlinking based on copy; 3.5 Automate Technical Audits)
Chapter 4: Content & UX (4.1 Content that best satisfies the user query; 4.2 Splitting and merging URLs; 4.3 Content Strategy: Planning landing page content)
Chapter 5: Authority (5.1 A little SEO history; 5.2 The source of authority; 5.3 Finding good links)
Chapter 6: Competitors (6.1 Defining the problem; 6.2 Data Strategy; 6.3 Data Sources; 6.4 Selecting Your Competitors; 6.5 Get Features; 6.6 Explore, Clean and Transform; 6.7 Modelling the SERPs; 6.8 Evaluating your Model; 6.9 Activation)
Chapter 7: Experiments (7.1 How experiments fit into the SEO process; 7.2 Generating Hypotheses; 7.3 Experiment Design; 7.4 Running your experiment; 7.5 Experiment Evaluation)
Chapter 8: Dashboards (8.1 Use a Data Layer; 8.2 Extract, Transform and Load (ETL); 8.3 Transform; 8.4 Querying the Data Warehouse (DW); 8.5 Visualization; 8.6 Making Future Forecasts)
Chapter 9: Site Migrations and Relaunches (9.1 Data sources; 9.2 Establishing the Impact; 9.3 Segmenting the URLs; 9.4 Legacy Site URLs; 9.5 Priority; 9.6 Roadmap)
Chapter 10: Google Updates (10.1 Data sources; 10.2 Winners and Losers; 10.3 Quantifying the Impact; 10.4 Search Intent; 10.5 Unique URLs; 10.6 Recommendations)
Chapter 11: The Future of SEO (11.1 Automation; 11.2 Your journey to SEO science; 11.3 Suggested resources)
Appendix: Code; Glossary; Index
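As a flavour of the kind of analysis the book automates, here is a minimal pandas sketch in the spirit of Chapter 3's CTR work (the column names and toy Search Console rows are hypothetical, not the book's dataset):

# Minimal sketch: flag queries whose click-through rate underperforms peers
# at a similar ranking position (toy data; columns are hypothetical).
import pandas as pd

gsc = pd.DataFrame({
    "query":       ["blue shoes", "red shoes", "green shoes"],
    "position":    [2.1, 2.3, 2.2],
    "impressions": [1000, 800, 1200],
    "clicks":      [180, 40, 150],
})
gsc["ctr"] = gsc["clicks"] / gsc["impressions"]

# Expected CTR for each position band = median CTR of its peers.
gsc["expected_ctr"] = gsc.groupby(gsc["position"].round())["ctr"].transform("median")
underperformers = gsc[gsc["ctr"] < 0.5 * gsc["expected_ctr"]]
print(underperformers[["query", "position", "ctr", "expected_ctr"]])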
£29.69
O'Reilly Media Learning Spark
Book Synopsis: Updated to emphasize new features in Spark 2.4, this second edition shows data engineers and scientists why structure and unification in Spark matter. Specifically, this book explains how to perform simple and complex data analytics and employ machine learning algorithms.
£47.99
O'Reilly Media Fundamentals of Deep Learning
Book Synopsis: This updated second edition describes the intuition behind deep learning innovations without jargon or complexity. By the end of this book, Python-proficient programmers, software engineering professionals, and computer science majors will be able to re-implement these breakthroughs on their own.
£47.99
O'Reilly Media Introducing MLOps
Book Synopsis: This book introduces the key concepts of MLOps to help data scientists and application engineers not only operationalize ML models to drive real business change but also maintain and improve those models over time.
£39.74
O'Reilly Media Probabilistic Machine Learning for Finance and Investing
Book Synopsis: By moving away from flawed statistical methodologies, you'll move toward an intuitive view of probability as a mathematically rigorous statistical framework that quantifies uncertainty holistically and successfully. This book shows you how.
£47.99
O'Reilly Media Building Recommendation Systems in Python and JAX
Book Synopsis: In this practical book, authors Bryan Bischof and Hector Yee illustrate the core concepts and examples to help you create a RecSys for any industry or scale. You'll learn the math, ideas, and implementation details you need to succeed.
£47.99
Manning Publications Deep Reinforcement Learning in Action
Book Synopsis: Humans learn best from feedback—we are encouraged to take actions that lead to positive results and deterred by decisions with negative consequences. This reinforcement process can be applied to computer programs, allowing them to solve complex problems that classical programming cannot. Deep Reinforcement Learning in Action teaches you the fundamental concepts and terminology of deep reinforcement learning, along with the practical skills and techniques you'll need to apply it in your own projects. Key features:
• Structuring problems as Markov Decision Processes
• Popular algorithms such as Deep Q-Networks, the Policy Gradient method, and Evolutionary Algorithms, and the intuitions that drive them
• Applying reinforcement learning algorithms to real-world problems
Audience: You'll need intermediate Python skills and a basic understanding of deep learning.
About the technology: Deep reinforcement learning is a form of machine learning in which AI agents learn optimal behavior from their own raw sensory input. The system perceives the environment, interprets the results of its past decisions, and uses this information to optimize its behavior for maximum long-term return. Deep reinforcement learning famously contributed to the success of AlphaGo, but that's not all it can do!
Alexander Zai is a Machine Learning Engineer at Amazon AI working on MXNet, which powers a suite of AWS machine learning products. Brandon Brown is a Machine Learning and Data Analysis blogger at outlace.com committed to providing clear teaching on difficult topics for newcomers.
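The update rule that drives this learning is easy to state. Here is a minimal sketch in tabular form (a Deep Q-Network replaces the table below with a neural network; the states, actions, and constants are illustrative, not the book's code):

# Minimal sketch of the Q-learning update at the heart of deep RL, shown in
# tabular form for clarity (illustrative, not the book's code):
#   Q(s, a) <- Q(s, a) + alpha * (r + gamma * max_a' Q(s', a') - Q(s, a))
import numpy as np

n_states, n_actions = 5, 2
alpha, gamma = 0.1, 0.99            # learning rate and discount factor
Q = np.zeros((n_states, n_actions))

def q_update(state, action, reward, next_state):
    """One temporal-difference step toward the bootstrapped target."""
    target = reward + gamma * Q[next_state].max()
    Q[state, action] += alpha * (target - Q[state, action])

q_update(state=0, action=1, reward=1.0, next_state=2)
print(Q[0])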
£35.99
Manning Publications Inside Deep Learning: Math, Algorithms, Models
Book Synopsis"If you want to learn some of the deeper explanations of deep learning and PyTorch then read this book!" - Tiklu Ganguly Journey through the theory and practice of modern deep learning, and apply innovative techniques to solve everyday data problems. In Inside Deep Learning, you will learn how to: Implement deep learning with PyTorchSelect the right deep learning componentsTrain and evaluate a deep learning modelFine tune deep learning models to maximize performanceUnderstand deep learning terminologyAdapt existing PyTorch code to solve new problems Inside Deep Learning is an accessible guide to implementing deep learning with the PyTorch framework. It demystifies complex deep learning concepts and teaches you to understand the vocabulary of deep learning so you can keep pace in a rapidly evolving field. No detail is skipped—you'll dive into math, theory, and practical applications. Everything is clearly explained in plain English. about the technologyDeep learning isn't just for big tech companies and academics. Anyone who needs to find meaningful insights and patterns in their data can benefit from these practical techniques! The unique ability for your systems to learn by example makes deep learning widely applicable across industries and use-cases, from filtering out spam to driving cars. about the bookInside Deep Learning is a fast-paced beginners' guide to solving common technical problems with deep learning. Written for everyday developers, there are no complex mathematical proofs or unnecessary academic theory. You'll learn how deep learning works through plain language, annotated code and equations as you work through dozens of instantly useful PyTorch examples. As you go, you'll build a French-English translator that works on the same principles as professional machine translation and discover cutting-edge techniques just emerging from the latest research. Best of all, every deep learning solution in this book can run in less than fifteen minutes using free GPU hardware! about the readerFor Python programmers with basic machine learning skills. about the authorEdward Raff is a Chief Scientist at Booz Allen Hamilton, and the author of the JSAT machine learning library. His research includes deep learning, malware detection, reproducibility in ML, fairness/bias, and high performance computing. He is also a visiting professor at the University of Maryland, Baltimore County and teaches deep learning in the Data Science department. Dr Raff has over 40 peer reviewed publications, three best paper awards, and has presented at numerous major conferences.Trade Review“Afantastic book with a colourful and intuitive way of describing how deep learning works.” Richard Vaughan “Amazing at what it does. It's a book for people who not only want to use deep learning, but also understand it!” Adam Slysz “A remarkably clear explanation of practical deep learning showing readers how to quickly and systematically apply deep learning techniques tosolve their everyday data problems.” Jeff Neumann “If you want to learn some of the deeper explanations of deep learning and PyTorch then read this book!” Tiklu Ganguly “A must read if you don't understand how Deep Learning works under the hood.” Abdul Basit Hafeez
£35.99
Manning Publications Evolutionary Deep Learning
Book Synopsis: Discover one-of-a-kind AI strategies never before seen outside of academic papers! Learn how the principles of evolutionary computation overcome deep learning's common pitfalls and deliver adaptable model upgrades without constant manual adjustment. In Evolutionary Deep Learning you will learn how to:
• Solve complex design and analysis problems with evolutionary computation
• Tune deep learning hyperparameters with evolutionary computation (EC), genetic algorithms, and particle swarm optimization
• Use unsupervised learning with a deep learning autoencoder to regenerate sample data
• Understand the basics of reinforcement learning and the Q-learning equation
• Apply Q-learning to deep learning to produce deep reinforcement learning
• Optimize the loss function and network architecture of unsupervised autoencoders
• Make an evolutionary agent that can play an OpenAI Gym game
Evolutionary Deep Learning is a guide to improving your deep learning models with AutoML enhancements based on the principles of biological evolution. This exciting new approach utilizes lesser-known AI approaches to boost performance without hours of data annotation or model hyperparameter tuning.
About the technology: Evolutionary deep learning merges the biology-simulating practices of evolutionary computation (EC) with the neural networks of deep learning. This unique approach can automate entire DL systems and help uncover new strategies and architectures. It gives new and aspiring AI engineers a set of optimization tools that can reliably improve output without demanding an endless churn of new data.
About the reader: For data scientists who know Python.
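A minimal sketch of the core idea, a genetic algorithm searching over hyperparameters (the genome layout and toy fitness function are illustrative; in practice fitness would come from training and validating a model):

# Minimal genetic-algorithm sketch for hyperparameter search (the fitness
# function is a stand-in; in practice it would train and validate a model).
import random

def fitness(genome):
    lr, layers = genome
    return -((lr - 0.01) ** 2) - 0.001 * (layers - 3) ** 2   # toy objective

def mutate(genome):
    lr, layers = genome
    return (abs(lr + random.gauss(0, 0.005)),
            max(1, layers + random.choice((-1, 0, 1))))

population = [(random.uniform(0.001, 0.1), random.randint(1, 8)) for _ in range(20)]
for generation in range(30):
    population.sort(key=fitness, reverse=True)       # rank by fitness
    parents = population[:5]                         # truncation selection
    population = parents + [mutate(random.choice(parents)) for _ in range(15)]

print("best genome (lr, layers):", max(population, key=fitness))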
£41.39
Manning Publications Time Series Forecasting in Python
Book Synopsis: Build predictive models from time-based patterns in your data. Master statistical models, including new deep learning approaches, for time series forecasting. In Time Series Forecasting in Python you will learn how to:
• Recognize a time series forecasting problem and build a performant predictive model
• Create univariate forecasting models that account for seasonal effects and external variables
• Build multivariate forecasting models to predict many time series at once
• Leverage large datasets by using deep learning for forecasting time series
• Automate the forecasting process
Description: Time Series Forecasting in Python teaches you to build powerful predictive models from time-based data. Every model you create is relevant, useful, and easy to implement with Python. You'll explore interesting real-world datasets like Google's daily stock price and economic data for the USA, quickly progressing from the basics to developing large-scale models that use deep learning tools like TensorFlow. You'll learn both traditional statistical and new deep learning models for time series forecasting, all fully illustrated with Python source code, and get immediate, meaningful predictions.
About the technology: Time series forecasting reveals hidden trends and makes predictions about the future from your data. This powerful technique has proven incredibly valuable across multiple fields—from tracking business metrics, to healthcare and the sciences. Modern Python libraries and powerful deep learning tools have opened up new methods and utilities for making practical time series forecasts.
About the book: Test your skills with hands-on projects for forecasting air travel, volume of drug prescriptions, and the earnings of Johnson & Johnson. By the time you're done, you'll be ready to build accurate and insightful forecasting models with tools from the Python ecosystem.
Table of Contents:
Part 1 (Time Waits for No One): 1 Understanding Time Series Forecasting; 2 A Naïve Prediction of the Future; 3 Going on a Random Walk
Part 2 (Forecasting with Statistical Models): 4 Modeling a Moving Average Process; 5 Modeling an Autoregressive Process; 6 Modeling Complex Time Series; 7 Forecasting Non-Stationary Time Series; 8 Accounting for Seasonality; 9 Adding External Variables to Our Model; 10 Forecasting Multiple Time Series; 11 Capstone: Forecasting the Number of Antidiabetic Drug Prescriptions in Australia
Part 3 (Large-Scale Forecasting with Deep Learning): 12 Introducing Deep Learning for Time Series Forecasting; 13 Data Windowing and Creating Baselines for Deep Learning; 14 Baby Steps with Deep Learning; 15 Remembering the Past with LSTM; 16 Filtering Our Time Series with CNN; 17 Using Predictions to Make More Predictions; 18 Capstone: Forecasting the Electric Power Consumption of a Household
Part 4 (Automating Forecasting at Scale): 19 Automating Time Series Forecasting with Prophet; 20 Capstone: Forecasting the Monthly Average Retail Price of Steak in Canada; 21 Going Above and Beyond
Appendix A: Installation Instructions
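As a taste of the statistical side covered in Part 2, here is a minimal SARIMA sketch with statsmodels (the synthetic monthly series and model orders are illustrative, not taken from the book):

# Minimal seasonal SARIMA forecast with statsmodels (synthetic monthly
# series; the (p,d,q)(P,D,Q,s) orders are illustrative, not tuned).
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(42)
t = np.arange(120)
y = 10 + 0.05 * t + 2 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.5, 120)

model = SARIMAX(y, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12))
result = model.fit(disp=False)
print(result.forecast(steps=12))   # forecast the next 12 months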
£41.39
Manning Publications Bayesian Optimization in Action
Book Synopsis: Apply advanced techniques for optimizing machine learning processes. For machine learning practitioners confident in maths and statistics, Bayesian Optimization in Action shows you how to optimize hyperparameter tuning, A/B testing, and other aspects of the machine learning process by applying cutting-edge Bayesian techniques. Using clear language, Bayesian Optimization helps pinpoint the best configuration for your machine learning models with speed and accuracy. With a range of illustrations and concrete examples, this book proves that Bayesian optimization doesn't have to be difficult! Key features include:
• Train Gaussian processes on both sparse and large data sets
• Combine Gaussian processes with deep neural networks to make them flexible and expressive
• Find the most successful strategies for hyperparameter tuning
• Navigate a search space and identify high-performing regions
• Apply Bayesian optimization to practical use cases such as cost-constrained, multi-objective, and preference optimization
• Use PyTorch, GPyTorch, and BoTorch to implement Bayesian optimization
You will get in-depth insights into how Bayesian optimization works and learn how to implement it with cutting-edge Python libraries. The book's easy-to-reuse code samples will let you hit the ground running by plugging them straight into your own projects!
About the technology: Experimenting in science and engineering can be costly and time-consuming, especially without a reliable way to narrow down your choices. Bayesian optimization helps you identify optimal configurations to pursue in a search space. It uses a Gaussian process and machine learning techniques to model an objective function and quantify the uncertainty of predictions. Whether you're tuning machine learning models, recommending products to customers, or engaging in research, Bayesian optimization can help you make better decisions faster.
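One step of that loop can be sketched in a few lines. The sketch below substitutes scikit-learn and SciPy for the book's PyTorch/GPyTorch/BoTorch stack to stay self-contained; the toy objective and candidate grid are illustrative:

# Minimal Bayesian-optimization step: fit a Gaussian process to the points
# seen so far, then choose the next point by expected improvement.
# (scikit-learn/SciPy stand in for the book's GPyTorch/BoTorch stack.)
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

f = lambda x: -(x - 2.0) ** 2          # hidden objective to maximise
X = np.array([[0.0], [1.0], [4.0]])    # points evaluated so far
y = f(X).ravel()

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5)).fit(X, y)

candidates = np.linspace(0, 5, 200).reshape(-1, 1)
mu, sigma = gp.predict(candidates, return_std=True)
improvement = mu - y.max()
z = improvement / np.maximum(sigma, 1e-9)
ei = improvement * norm.cdf(z) + sigma * norm.pdf(z)   # expected improvement

print("next point to evaluate:", candidates[ei.argmax()][0])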
£34.49
Harvard Business Review Press HBR's 10 Must Reads on AI
Book Synopsis: The next generation of AI is here—use it to lead your business forward. If you read nothing else on artificial intelligence and machine learning, read these 10 articles. We've combed through hundreds of Harvard Business Review articles and selected the most important ones to help you understand the future direction of AI, bring your AI initiatives to scale, and use AI to transform your organization. This book will inspire you to:
• Create a new AI strategy
• Learn to work with intelligent robots
• Get more from your marketing AI
• Be ready for ethical and regulatory challenges
• Understand how generative AI is game changing
• Stop tinkering with AI and go all in
This collection of articles includes "Competing in the Age of AI," by Marco Iansiti and Karim R. Lakhani; "How to Win with Machine Learning," by Ajay Agrawal, Joshua Gans, and Avi Goldfarb; "Developing a Digital Mindset," by Tsedal Neeley and Paul Leonardi; "Learning to Work with Intelligent Machines," by Matt Beane; "Getting AI to Scale," by Tim Fountaine, Brian McCarthy, and Tamim Saleh; "Why You Aren't Getting More from Your Marketing AI," by Eva Ascarza, Michael Ross, and Bruce G. S. Hardie; "The Pitfalls of Pricing Algorithms," by Marco Bertini and Oded Koenigsberg; "A Smarter Strategy for Using Robots," by Ben Armstrong and Julie Shah; "Why You Need an AI Ethics Committee," by Reid Blackman; "Robots Need Us More Than We Need Them," by H. James Wilson and Paul R. Daugherty; "Stop Tinkering with AI," by Thomas H. Davenport and Nitin Mittal; and "ChatGPT Is a Tipping Point for AI," by Ethan Mollick.
HBR's 10 Must Reads paperback series is the definitive collection of books for new and experienced leaders alike. Leaders looking for the inspiration that big ideas provide, both to accelerate their own growth and that of their companies, should look no further. HBR's 10 Must Reads series focuses on the core topics that every ambitious manager needs to know: leadership, strategy, change, managing people, and managing yourself. Harvard Business Review has sorted through hundreds of articles and selected only the most essential reading on each topic. Each title includes timeless advice that will be relevant regardless of an ever-changing business environment.
£16.14
The Pragmatic Programmers Genetic Algorithms and Machine Learning for Programmers
Book Synopsis: Self-driving cars, natural language recognition, and online recommendation engines are all possible thanks to machine learning. Now you can create your own genetic algorithms, nature-inspired swarms, Monte Carlo simulations, cellular automata, and clusters. Learn how to test your ML code and dive into even more advanced topics. If you are a beginner-to-intermediate programmer keen to understand machine learning, this book is for you. Discover machine learning algorithms using a handful of self-contained recipes. Build a repertoire of algorithms, discovering terms and approaches that apply generally. Bake intelligence into your algorithms, guiding them to discover good solutions to problems. In this book, you will: use heuristics and design fitness functions; build genetic algorithms; make nature-inspired swarms with ants, bees, and particles; create Monte Carlo simulations; investigate cellular automata; find minima and maxima using hill climbing and simulated annealing; try selection methods, including tournament and roulette wheels; learn about heuristics, fitness functions, metrics, and clusters; and test your code and get inspired to try new problems. Work through scenarios to code your way out of a paper bag; an important skill for any competent programmer. See how the algorithms explore and learn by creating visualizations of each problem. Get inspired to design your own machine learning projects and become familiar with the jargon. What You Need: code in C++ (>= C++11), Python (2.x or 3.x), and JavaScript (using the HTML5 canvas). The book also uses matplotlib and some open source libraries, including SFML, Catch, and Cosmic-Ray. These plotting and testing libraries are not required, but their use will give you a fuller experience. Armed with just a text editor and a compiler/interpreter for your language of choice, you can still code along from the general algorithm descriptions.
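As a taste of the recipes, here is a minimal simulated annealing sketch in Python, one of the optimization methods listed above (the toy objective and cooling schedule are illustrative):

# Minimal simulated-annealing sketch: accept worse moves with a probability
# that shrinks as the temperature cools (toy 1-D minimisation; illustrative).
import math
import random

def objective(x):
    return x * x + 10 * math.sin(x)    # bumpy function with local minima

x = random.uniform(-10, 10)
temperature = 10.0
while temperature > 1e-3:
    candidate = x + random.gauss(0, 1)                  # propose a nearby move
    delta = objective(candidate) - objective(x)
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        x = candidate                                   # accept (maybe uphill)
    temperature *= 0.99                                 # cool down

print(f"found minimum near x = {x:.3f}, f(x) = {objective(x):.3f}")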
£35.14
Springer International Publishing AG Deep Learning: Foundations and Concepts
Book Synopsis: This book offers a comprehensive introduction to the central ideas that underpin deep learning. It is intended both for newcomers to machine learning and for those already experienced in the field. Covering key concepts relating to contemporary architectures and techniques, this essential book equips readers with a robust foundation for potential future specialization. The field of deep learning is undergoing rapid evolution, and therefore this book focusses on ideas that are likely to endure the test of time.
The book is organized into numerous bite-sized chapters, each exploring a distinct topic, and the narrative follows a linear progression, with each chapter building upon content from its predecessors. This structure is well-suited to teaching a two-semester undergraduate or postgraduate machine learning course, while remaining equally relevant to those engaged in active research or in self-study.
A full understanding of machine learning requires some mathematical background, and so the book includes a self-contained introduction to probability theory. However, the focus of the book is on conveying a clear understanding of ideas, with emphasis on the real-world practical value of techniques rather than on abstract theory. Complex concepts are therefore presented from multiple complementary perspectives including textual descriptions, diagrams, mathematical formulae, and pseudo-code.
Chris Bishop is a Technical Fellow at Microsoft and is the Director of Microsoft Research AI4Science. He is a Fellow of Darwin College Cambridge, a Fellow of the Royal Academy of Engineering, and a Fellow of the Royal Society. Hugh Bishop is an Applied Scientist at Wayve, a deep learning autonomous driving company in London, where he designs and trains deep neural networks. He completed his MPhil in Machine Learning and Machine Intelligence at Cambridge University.
"Chris Bishop wrote a terrific textbook on neural networks in 1995 and has a deep knowledge of the field and its core ideas. His many years of experience in explaining neural networks have made him extremely skillful at presenting complicated ideas in the simplest possible way and it is a delight to see these skills applied to the revolutionary new developments in the field." -- Geoffrey Hinton
"With the recent explosion of deep learning and AI as a research topic, and the quickly growing importance of AI applications, a modern textbook on the topic was badly needed. The "New Bishop" masterfully fills the gap, covering algorithms for supervised and unsupervised learning, modern deep learning architecture families, as well as how to apply all of this to various application areas." -- Yann LeCun
"This excellent and very educational book will bring the reader up to date with the main concepts and advances in deep learning with a solid anchoring in probability. These concepts are powering current industrial AI systems and are likely to form the basis of further advances towards artificial general intelligence." -- Yoshua Bengio
Table of Contents:
Preface
1 The Deep Learning Revolution: 1.1 The Impact of Deep Learning (Medical diagnosis; Protein structure; Image synthesis; Large language models); 1.2 A Tutorial Example (Synthetic data; Linear models; Error function; Model complexity; Regularization; Model selection); 1.3 A Brief History of Machine Learning (Single-layer networks; Backpropagation; Deep networks)
2 Probabilities: 2.1 The Rules of Probability (A medical screening example; The sum and product rules; Bayes' theorem; Medical screening revisited; Prior and posterior probabilities; Independent variables); 2.2 Probability Densities (Example distributions; Expectations and covariances); 2.3 The Gaussian Distribution (Mean and variance; Likelihood function; Bias of maximum likelihood; Linear regression); 2.4 Transformation of Densities (Multivariate distributions); 2.5 Information Theory (Entropy; Physics perspective; Differential entropy; Maximum entropy; Kullback–Leibler divergence; Conditional entropy; Mutual information); 2.6 Bayesian Probabilities (Model parameters; Regularization; Bayesian machine learning); Exercises
3 Standard Distributions: 3.1 Discrete Variables (Bernoulli distribution; Binomial distribution; Multinomial distribution); 3.2 The Multivariate Gaussian (Geometry of the Gaussian; Moments; Limitations; Conditional distribution; Marginal distribution; Bayes' theorem; Maximum likelihood; Sequential estimation; Mixtures of Gaussians); 3.3 Periodic Variables (Von Mises distribution); 3.4 The Exponential Family (Sufficient statistics); 3.5 Nonparametric Methods (Histograms; Kernel densities; Nearest-neighbours); Exercises
4 Single-layer Networks: Regression: 4.1 Linear Regression (Basis functions; Likelihood function; Maximum likelihood; Geometry of least squares; Sequential learning; Regularized least squares; Multiple outputs); 4.2 Decision Theory; 4.3 The Bias–Variance Trade-off; Exercises
5 Single-layer Networks: Classification: 5.1 Discriminant Functions (Two classes; Multiple classes; 1-of-K coding; Least squares for classification); 5.2 Decision Theory (Misclassification rate; Expected loss; The reject option; Inference and decision; Classifier accuracy; ROC curve); 5.3 Generative Classifiers (Continuous inputs; Maximum likelihood solution; Discrete features; Exponential family); 5.4 Discriminative Classifiers (Activation functions; Fixed basis functions; Logistic regression)
. . . . . . . . . . . . . . . . . . . . 177 5.4.4 Multi-class logistic regression . . . . . . . . . . . . . . . . 179 5.4.5 Probit regression . . . . . . . . . . . . . . . . . . . . . . . 181 5.4.6 Canonical link functions . . . . . . . . . . . . . . . . . . . 182 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 184 6 Deep Neural Networks 189 6.1 Limitations of Fixed Basis Functions . . . . . . . . . . . . . . . . 190 6.1.1 The curse of dimensionality . . . . . . . . . . . . . . . . . 190 6.1.2 High-dimensional spaces . . . . . . . . . . . . . . . . . . . 193 6.1.3 Data manifolds . . . . . . . . . . . . . . . . . . . . . . . . 194 6.1.4 Data-dependent basis functions . . . . . . . . . . . . . . . 196 6.2 Multilayer Networks . . . . . . . . . . . . . . . . . . . . . . . . . 198 6.2.1 Parameter matrices . . . . . . . . . . . . . . . . . . . . . . 199 6.2.2 Universal approximation . . . . . . . . . . . . . . . . . . . 199 6.2.3 Hidden unit activation functions . . . . . . . . . . . . . . . 200 6.2.4 Weight-space symmetries . . . . . . . . . . . . . . . . . . 203 6.3 Deep Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . 204 6.3.1 Hierarchical representations . . . . . . . . . . . . . . . . . 205 6.3.2 Distributed representations . . . . . . . . . . . . . . . . . . 205 6.3.3 Representation learning . . . . . . . . . . . . . . . . . . . 206 6.3.4 Transfer learning . . . . . . . . . . . . . . . . . . . . . . . 207 6.3.5 Contrastive learning . . . . . . . . . . . . . . . . . . . . . 209 6.3.6 General network architectures . . . . . . . . . . . . . . . . 211 6.3.7 Tensors . . . . . . . . . . . . . . . . . . . . . . . . . . . . 212 6.4 Error Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . 212 6.4.1 Regression . . . . . . . . . . . . . . . . . . . . . . . . . . 212 6.4.2 Binary classification . . . . . . . . . . . . . . . . . . . . . 214 6.4.3 multiclass classification . . . . . . . . . . . . . . . . . . . 215 6.5 Mixture Density Networks . . . . . . . . . . . . . . . . . . . . . . 216 6.5.1 Robot kinematics example . . . . . . . . . . . . . . . . . . 216 6.5.2 Conditional mixture distributions . . . . . . . . . . . . . . 217 6.5.3 Gradient optimization . . . . . . . . . . . . . . . . . . . . 219 6.5.4 Predictive distribution . . . . . . . . . . . . . . . . . . . . 220 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 222 7 Gradient Descent 227 7.1 Error Surfaces . . . . . . . . . . . . . . . . . . . . . . . . . . . . 228 7.1.1 Local quadratic approximation . . . . . . . . . . . . . . . . 229 7.2 Gradient Descent Optimization . . . . . . . . . . . . . . . . . . . 231 7.2.1 Use of gradient information . . . . . . . . . . . . . . . . . 232 7.2.2 Batch gradient descent . . . . . . . . . . . . . . . . . . . . 232 7.2.3 Stochastic gradient descent . . . . . . . . . . . . . . . . . . 232 7.2.4 Mini-batches . . . . . . . . . . . . . . . . . . . . . . . . . 234 7.2.5 Parameter initialization . . . . . . . . . . . . . . . . . . . . 234 7.3 Convergence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 236 7.3.1 Momentum . . . . . . . . . . . . . . . . . . . . . . . . . . 238 7.3.2 Learning rate schedule . . . . . . . . . . . . . . . . . . . . 240 7.3.3 RMSProp and Adam . . . . . . . . . . . . . . . . . . . . . 241 7.4 Normalization . . . . . . . . . . . . . . . . . . . . . . . . . . . . 242 7.4.1 Data normalization . . . . . . . . . . . . . . . . . . . . . . 244 7.4.2 Batch normalization . . . . . . . . . . . . . 
. . . . . . . . 245 7.4.3 Layer normalization . . . . . . . . . . . . . . . . . . . . . 247 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 248 8 Backpropagation 251 8.1 Evaluation of Gradients . . . . . . . . . . . . . . . . . . . . . . . 252 8.1.1 Single-layer networks . . . . . . . . . . . . . . . . . . . . 252 8.1.2 General feed-forward networks . . . . . . . . . . . . . . . 253 8.1.3 A simple example . . . . . . . . . . . . . . . . . . . . . . 256 8.1.4 Numerical differentiation . . . . . . . . . . . . . . . . . . . 257 8.1.5 The Jacobian matrix . . . . . . . . . . . . . . . . . . . . . 258 8.1.6 The Hessian matrix . . . . . . . . . . . . . . . . . . . . . . 260 8.2 Automatic Differentiation . . . . . . . . . . . . . . . . . . . . . . 262 8.2.1 Forward-mode automatic differentiation . . . . . . . . . . . 264 8.2.2 Reverse-mode automatic differentiation . . . . . . . . . . . 267 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 268 9 Regularization 271 9.1 Inductive Bias . . . . . . . . . . . . . . . . . . . . . . . . . . . . 272 9.1.1 Inverse problems . . . . . . . . . . . . . . . . . . . . . . . 272 9.1.2 No free lunch theorem . . . . . . . . . . . . . . . . . . . . 273 9.1.3 Symmetry and invariance . . . . . . . . . . . . . . . . . . . 274 9.1.4 Equivariance . . . . . . . . . . . . . . . . . . . . . . . . . 277 9.2 Weight Decay . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 278 9.2.1 Consistent regularizers . . . . . . . . . . . . . . . . . . . . 280 9.2.2 Generalized weight decay . . . . . . . . . . . . . . . . . . 282 9.3 Learning Curves . . . . . . . . . . . . . . . . . . . . . . . . . . . 284 9.3.1 Early stopping . . . . . . . . . . . . . . . . . . . . . . . . 284 9.3.2 Double descent . . . . . . . . . . . . . . . . . . . . . . . . 286 9.4 Parameter Sharing . . . . . . . . . . . . . . . . . . . . . . . . . . 288 9.4.1 Soft weight sharing . . . . . . . . . . . . . . . . . . . . . . 289 9.5 Residual Connections . . . . . . . . . . . . . . . . . . . . . . . . 292 9.6 Model Averaging . . . . . . . . . . . . . . . . . . . . . . . . . . . 295 9.6.1 Dropout . . . . . . . . . . . . . . . . . . . . . . . . . . . . 297 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 299 10 Convolutional Networks 305 10.1 Computer Vision . . . . . . . . . . . . . . . . . . . . . . . . . . . 306 10.1.1 Image data . . . . . . . . . . . . . . . . . . . . . . . . . . 307 10.2 Convolutional Filters . . . . . . . . . . . . . . . . . . . . . . . . . 308 10.2.1 Feature detectors . . . . . . . . . . . . . . . . . . . . . . . 308 10.2.2 Translation equivariance . . . . . . . . . . . . . . . . . . . 309 10.2.3 Padding . . . . . . . . . . . . . . . . . . . . . . . . . . . . 312 10.2.4 Strided convolutions . . . . . . . . . . . . . . . . . . . . . 312 10.2.5 Multi-dimensional convolutions . . . . . . . . . . . . . . . 313 10.2.6 Pooling . . . . . . . . . . . . . . . . . . . . . . . . . . . . 314 10.2.7 Multilayer convolutions . . . . . . . . . . . . . . . . . . . 316 10.2.8 Example network architectures . . . . . . . . . . . . . . . . 317 10.3 Visualizing Trained CNNs . . . . . . . . . . . . . . . . . . . . . . 320 10.3.1 Visual cortex . . . . . . . . . . . . . . . . . . . . . . . . . 320 10.3.2 Visualizing trained filters . . . . . . . . . . . . . . . . . . . 321 10.3.3 Saliency maps . . . . . . . . . . . . . . . . . . . . . . . . 323 10.3.4 Adversarial attacks . . . . . . . . . . . . . . . . . . . . . . 
324 10.3.5 Synthetic images . . . . . . . . . . . . . . . . . . . . . . . 326 10.4 Object Detection . . . . . . . . . . . . . . . . . . . . . . . . . . . 326 10.4.1 Bounding boxes . . . . . . . . . . . . . . . . . . . . . . . 327 10.4.2 Intersection-over-union . . . . . . . . . . . . . . . . . . . . 328 10.4.3 Sliding windows . . . . . . . . . . . . . . . . . . . . . . . 329 10.4.4 Detection across scales . . . . . . . . . . . . . . . . . . . . 331 10.4.5 Non-max suppression . . . . . . . . . . . . . . . . . . . . . 332 10.4.6 Fast region CNNs . . . . . . . . . . . . . . . . . . . . . . . 332 10.5 Image Segmentation . . . . . . . . . . . . . . . . . . . . . . . . . 333 10.5.1 Convolutional segmentation . . . . . . . . . . . . . . . . . 333 10.5.2 Up-sampling . . . . . . . . . . . . . . . . . . . . . . . . . 334 10.5.3 Fully convolutional networks . . . . . . . . . . . . . . . . . 336 10.5.4 The U-net architecture . . . . . . . . . . . . . . . . . . . . 337 10.6 Style Transfer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 338 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 340 11 Structured Distributions 343 11.1 Graphical Models . . . . . . . . . . . . . . . . . . . . . . . . . . . 344 11.1.1 Directed graphs . . . . . . . . . . . . . . . . . . . . . . . . 344 11.1.2 Factorization . . . . . . . . . . . . . . . . . . . . . . . . . 345 11.1.3 Discrete variables . . . . . . . . . . . . . . . . . . . . . . . 347 11.1.4 Gaussian variables . . . . . . . . . . . . . . . . . . . . . . 350 11.1.5 Binary classifier . . . . . . . . . . . . . . . . . . . . . . . 352 11.1.6 Parameters and observations . . . . . . . . . . . . . . . . . 352 11.1.7 Bayes’ theorem . . . . . . . . . . . . . . . . . . . . . . . . 354 11.2 Conditional Independence . . . . . . . . . . . . . . . . . . . . . . 355 11.2.1 Three example graphs . . . . . . . . . . . . . . . . . . . . 356 11.2.2 Explaining away . . . . . . . . . . . . . . . . . . . . . . . 359 11.2.3 D-separation . . . . . . . . . . . . . . . . . . . . . . . . . 361 11.2.4 Naive Bayes . . . . . . . . . . . . . . . . . . . . . . . . . 362 11.2.5 Generative models . . . . . . . . . . . . . . . . . . . . . . 364 11.2.6 Markov blanket . . . . . . . . . . . . . . . . . . . . . . . . 365 11.2.7 Graphs as filters . . . . . . . . . . . . . . . . . . . . . . . . 366 11.3 Sequence Models . . . . . . . . . . . . . . . . . . . . . . . . . . . 367 11.3.1 Hidden variables . . . . . . . . . . . . . . . . . . . . . . . 370 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 371 12 Transformers 375 12.1 Attention . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 376 12.1.1 Transformer processing . . . . . . . . . . . . . . . . . . . . 378 12.1.2 Attention coefficients . . . . . . . . . . . . . . . . . . . . . 379 12.1.3 Self-attention . . . . . . . . . . . . . . . . . . . . . . . . . 380 12.1.4 Network parameters . . . . . . . . . . . . . . . . . . . . . 381 12.1.5 Scaled self-attention . . . . . . . . . . . . . . . . . . . . . 384 12.1.6 Multi-head attention . . . . . . . . . . . . . . . . . . . . . 384 12.1.7 Transformer layers . . . . . . . . . . . . . . . . . . . . . . 386 12.1.8 Computational complexity . . . . . . . . . . . . . . . . . . 388 12.1.9 Positional encoding . . . . . . . . . . . . . . . . . . . . . . 389 12.2 Natural Language . . . . . . . . . . . . . . . . . . . . . . . . . . . 392 12.2.1 Word embedding . . . . . . . . . . . . . . . . . . . . . . . 393 12.2.2 Tokenization . . . . . . . . 
. . . . . . . . . . . . . . . . . 395 12.2.3 Bag of words . . . . . . . . . . . . . . . . . . . . . . . . . 396 12.2.4 Autoregressive models . . . . . . . . . . . . . . . . . . . . 397 12.2.5 Recurrent neural networks . . . . . . . . . . . . . . . . . . 398 12.2.6 Backpropagation through time . . . . . . . . . . . . . . . . 399 12.3 Transformer Language Models . . . . . . . . . . . . . . . . . . . . 400 12.3.1 Decoder transformers . . . . . . . . . . . . . . . . . . . . . 401 12.3.2 Sampling strategies . . . . . . . . . . . . . . . . . . . . . . 404 12.3.3 Encoder transformers . . . . . . . . . . . . . . . . . . . . . 406 12.3.4 Sequence-to-sequence transformers . . . . . . . . . . . . . 408 12.3.5 Large language models . . . . . . . . . . . . . . . . . . . . 408 12.4 Multimodal Transformers . . . . . . . . . . . . . . . . . . . . . . 412 12.4.1 Vision transformers . . . . . . . . . . . . . . . . . . . . . . 413 12.4.2 Generative image transformers . . . . . . . . . . . . . . . . 414 12.4.3 Audio data . . . . . . . . . . . . . . . . . . . . . . . . . . 417 12.4.4 Text-to-speech . . . . . . . . . . . . . . . . . . . . . . . . 418 12.4.5 Vision and language transformers . . . . . . . . . . . . . . 420 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 421 13 Graph Neural Networks 425 13.1 Machine Learning on Graphs . . . . . . . . . . . . . . . . . . . . 427 13.1.1 Graph properties . . . . . . . . . . . . . . . . . . . . . . . 428 13.1.2 Adjacency matrix . . . . . . . . . . . . . . . . . . . . . . . 428 13.1.3 Permutation equivariance . . . . . . . . . . . . . . . . . . . 429 13.2 Neural Message-Passing . . . . . . . . . . . . . . . . . . . . . . . 430 13.2.1 Convolutional filters . . . . . . . . . . . . . . . . . . . . . 431 13.2.2 Graph convolutional networks . . . . . . . . . . . . . . . . 432 13.2.3 Aggregation operators . . . . . . . . . . . . . . . . . . . . 434 13.2.4 Update operators . . . . . . . . . . . . . . . . . . . . . . . 436 13.2.5 Node classification . . . . . . . . . . . . . . . . . . . . . . 437 13.2.6 Edge classification . . . . . . . . . . . . . . . . . . . . . . 438 13.2.7 Graph classification . . . . . . . . . . . . . . . . . . . . . . 438 13.3 General Graph Networks . . . . . . . . . . . . . . . . . . . . . . . 438 13.3.1 Graph attention networks . . . . . . . . . . . . . . . . . . . 439 13.3.2 Edge embeddings . . . . . . . . . . . . . . . . . . . . . . . 439 13.3.3 Graph embeddings . . . . . . . . . . . . . . . . . . . . . . 440 13.3.4 Over-smoothing . . . . . . . . . . . . . . . . . . . . . . . 440 13.3.5 Regularization . . . . . . . . . . . . . . . . . . . . . . . . 441 13.3.6 Geometric deep learning . . . . . . . . . . . . . . . . . . . 442 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 443 14 Sampling 447 14.1 Basic Sampling Algorithms . . . . . . . . . . . . . . . . . . . . . 448 14.1.1 Expectations . . . . . . . . . . . . . . . . . . . . . . . . . 448 14.1.2 Standard distributions . . . . . . . . . . . . . . . . . . . . 449 14.1.3 Rejection sampling . . . . . . . . . . . . . . . . . . . . . . 451 14.1.4 Adaptive rejection sampling . . . . . . . . . . . . . . . . . 453 14.1.5 Importance sampling . . . . . . . . . . . . . . . . . . . . . 455 14.1.6 Sampling-importance-resampling . . . . . . . . . . . . . . 457 14.2 Markov Chain Monte Carlo . . . . . . . . . . . . . . . . . . . . . 458 14.2.1 The Metropolis algorithm . . . . . . . . . . . . . . . . . . 459 14.2.2 Markov chains . . . . . . . . . . . . . . . 
. . . . . . . . . 460 14.2.3 The Metropolis–Hastings algorithm . . . . . . . . . . . . . 463 14.2.4 Gibbs sampling . . . . . . . . . . . . . . . . . . . . . . . . 464 14.2.5 Ancestral sampling . . . . . . . . . . . . . . . . . . . . . . 468 14.3 Langevin Sampling . . . . . . . . . . . . . . . . . . . . . . . . . . 469 14.3.1 Energy-based models . . . . . . . . . . . . . . . . . . . . . 470 14.3.2 Maximizing the likelihood . . . . . . . . . . . . . . . . . . 471 14.3.3 Langevin dynamics . . . . . . . . . . . . . . . . . . . . . . 472 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 474 15 Discrete Latent Variables 477 15.1 K-means Clustering . . . . . . . . . . . . . . . . . . . . . . . . . 478 15.1.1 Image segmentation . . . . . . . . . . . . . . . . . . . . . 482 15.2 Mixtures of Gaussians . . . . . . . . . . . . . . . . . . . . . . . . 484 15.2.1 Likelihood function . . . . . . . . . . . . . . . . . . . . . . 486 15.2.2 Maximum likelihood . . . . . . . . . . . . . . . . . . . . . 488 15.3 Expectation–Maximization Algorithm . . . . . . . . . . . . . . . . 492 15.3.1 Gaussian mixtures . . . . . . . . . . . . . . . . . . . . . . 496 15.3.2 Relation to K-means . . . . . . . . . . . . . . . . . . . . . 498 15.3.3 Mixtures of Bernoulli distributions . . . . . . . . . . . . . . 499 15.4 Evidence Lower Bound . . . . . . . . . . . . . . . . . . . . . . . 503 15.4.1 EM revisited . . . . . . . . . . . . . . . . . . . . . . . . . 504 15.4.2 Independent and identically distributed data . . . . . . . . . 506 15.4.3 Parameter priors . . . . . . . . . . . . . . . . . . . . . . . 507 15.4.4 Generalized EM . . . . . . . . . . . . . . . . . . . . . . . 507 15.4.5 Sequential EM . . . . . . . . . . . . . . . . . . . . . . . . 508 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 508 16 Continuous Latent Variables 513 16.1 Principal Component Analysis . . . . . . . . . . . . . . . . . . . . 515 16.1.1 Maximum variance formulation . . . . . . . . . . . . . . . 515 16.1.2 Minimum-error formulation . . . . . . . . . . . . . . . . . 517 16.1.3 Data compression . . . . . . . . . . . . . . . . . . . . . . . 519 16.1.4 Data whitening . . . . . . . . . . . . . . . . . . . . . . . . 520 16.1.5 High-dimensional data . . . . . . . . . . . . . . . . . . . . 522 16.2 Probabilistic Latent Variables . . . . . . . . . . . . . . . . . . . . 524 16.2.1 Generative model . . . . . . . . . . . . . . . . . . . . . . . 524 16.2.2 Likelihood function . . . . . . . . . . . . . . . . . . . . . . 525 16.2.3 Maximum likelihood . . . . . . . . . . . . . . . . . . . . . 527 16.2.4 Factor analysis . . . . . . . . . . . . . . . . . . . . . . . . 531 16.2.5 Independent component analysis . . . . . . . . . . . . . . . 532 16.2.6 Kalman filters . . . . . . . . . . . . . . . . . . . . . . . . . 533 16.3 Evidence Lower Bound . . . . . . . . . . . . . . . . . . . . . . . 534 16.3.1 Expectation maximization . . . . . . . . . . . . . . . . . . 536 16.3.2 EM for PCA . . . . . . . . . . . . . . . . . . . . . . . . . 537 16.3.3 EM for factor analysis . . . . . . . . . . . . . . . . . . . . 538 16.4 Nonlinear Latent Variable Models . . . . . . . . . . . . . . . . . . 540 16.4.1 Nonlinear manifolds . . . . . . . . . . . . . . . . . . . . . 540 16.4.2 Likelihood function . . . . . . . . . . . . . . . . . . . . . . 542 16.4.3 Discrete data . . . . . . . . . . . . . . . . . . . . . . . . . 544 16.4.4 Four approaches to generative modelling . . . . . . . . . . 545 Exercises . . . . . . . . . . . . . . 
. . . . . . . . . . . . . . . . . . . . 545 17 Generative Adversarial Networks 551 17.1 Adversarial Training . . . . . . . . . . . . . . . . . . . . . . . . . 552 17.1.1 Loss function . . . . . . . . . . . . . . . . . . . . . . . . . 553 17.1.2 GAN training in practice . . . . . . . . . . . . . . . . . . . 554 17.2 Image GANs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 557 17.2.1 CycleGAN . . . . . . . . . . . . . . . . . . . . . . . . . . 557 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 562 18 Normalizing Flows 565 18.1 Coupling Flows . . . . . . . . . . . . . . . . . . . . . . . . . . . . 567 18.2 Autoregressive Flows . . . . . . . . . . . . . . . . . . . . . . . . . 570 18.3 Continuous Flows . . . . . . . . . . . . . . . . . . . . . . . . . . 572 18.3.1 Neural differential equations . . . . . . . . . . . . . . . . . 572 18.3.2 Neural ODE backpropagation . . . . . . . . . . . . . . . . 573 18.3.3 Neural ODE flows . . . . . . . . . . . . . . . . . . . . . . 575 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 577 19 Autoencoders 581 19.1 Deterministic Autoencoders . . . . . . . . . . . . . . . . . . . . . 582 19.1.1 Linear autoencoders . . . . . . . . . . . . . . . . . . . . . 582 19.1.2 Deep autoencoders . . . . . . . . . . . . . . . . . . . . . . 583 19.1.3 Sparse autoencoders . . . . . . . . . . . . . . . . . . . . . 584 19.1.4 Denoising autoencoders . . . . . . . . . . . . . . . . . . . 585 19.1.5 Masked autoencoders . . . . . . . . . . . . . . . . . . . . . 585 19.2 Variational Autoencoders . . . . . . . . . . . . . . . . . . . . . . . 587 19.2.1 Amortized inference . . . . . . . . . . . . . . . . . . . . . 590 19.2.2 The reparameterization trick . . . . . . . . . . . . . . . . . 592 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 596 20 Diffusion Models 599 20.1 Forward Encoder . . . . . . . . . . . . . . . . . . . . . . . . . . . 600 20.1.1 Diffusion kernel . . . . . . . . . . . . . . . . . . . . . . . 601 20.1.2 Conditional distribution . . . . . . . . . . . . . . . . . . . 602 20.2 Reverse Decoder . . . . . . . . . . . . . . . . . . . . . . . . . . . 603 20.2.1 Training the decoder . . . . . . . . . . . . . . . . . . . . . 605 20.2.2 Evidence lower bound . . . . . . . . . . . . . . . . . . . . 606 20.2.3 Rewriting the ELBO . . . . . . . . . . . . . . . . . . . . . 607 20.2.4 Predicting the noise . . . . . . . . . . . . . . . . . . . . . . 609 20.2.5 Generating new samples . . . . . . . . . . . . . . . . . . . 610 20.3 Score Matching . . . . . . . . . . . . . . . . . . . . . . . . . . . . 612 20.3.1 Score loss function . . . . . . . . . . . . . . . . . . . . . . 613 20.3.2 Modified score loss . . . . . . . . . . . . . . . . . . . . . . 614 20.3.3 Noise variance . . . . . . . . . . . . . . . . . . . . . . . . 615 20.3.4 Stochastic differential equations . . . . . . . . . . . . . . . 616 20.4 Guided Diffusion . . . . . . . . . . . . . . . . . . . . . . . . . . . 617 20.4.1 Classifier guidance . . . . . . . . . . . . . . . . . . . . . . 618 20.4.2 Classifier-free guidance . . . . . . . . . . . . . . . . . . . 618 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 621 Appendix A Linear Algebra 627 A.1 Matrix Identities . . . . . . . . . . . . . . . . . . . . . . . . . . . 627 A.2 Traces and Determinants . . . . . . . . . . . . . . . . . . . . . . . 628 A.3 Matrix Derivatives . . . . . . . . . . . . . . . . . . . . . . . . . . 629 A.4 Eigenvectors . . . . 
. . . . . . . . . . . . . . . . . . . . . . . . . 630 Appendix B Calculus of Variations 635 Appendix C Lagrange Multipliers 639 Bibliography 643 Index 659
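As a taste of the tutorial example that opens the book (fitting noisy sinusoidal data with a polynomial and taming over-fitting via regularized least squares, sections 1.2.2-1.2.5 and 4.1.6), here is a minimal NumPy sketch. It is illustrative only, not code from the book, and every parameter value (polynomial order, noise level, regularization coefficient) is an arbitrary assumption.

import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 10)
t = np.sin(2 * np.pi * x) + rng.normal(scale=0.25, size=x.size)  # noisy targets

M = 9        # polynomial order, high enough to over-fit if unregularized
lam = 1e-3   # regularization coefficient (lambda); arbitrary for illustration

Phi = np.vander(x, M + 1, increasing=True)   # design matrix of powers of x
# Regularized least squares: w = (lambda*I + Phi^T Phi)^(-1) Phi^T t
w = np.linalg.solve(lam * np.eye(M + 1) + Phi.T @ Phi, Phi.T @ t)

x_grid = np.linspace(0, 1, 100)
y_grid = np.vander(x_grid, M + 1, increasing=True) @ w   # smooth fitted curve

Increasing lam trades a larger training error for a smoother curve, which is exactly the model-complexity story the first chapter tells.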
£62.99
Cambridge University Press Inference and Learning from Data
Book SynopsisThis extraordinary three-volume work, written in an engaging and rigorous style by a world authority in the field, provides an accessible, comprehensive introduction to the full spectrum of mathematical and statistical techniques underpinning contemporary methods in data-driven learning and inference. The first volume, Foundations, establishes core topics in inference and learning, and prepares readers for studying their practical application. The second volume, Inference, introduces readers to cutting-edge techniques for inferring unknown variables and quantities. The final volume, Learning, provides a rigorous introduction to state-of-the-art learning methods. A consistent structure and pedagogy are employed throughout all three volumes to reinforce student understanding, with over 1280 end-of-chapter problems (including solutions for instructors), over 600 figures, over 470 solved examples, datasets and downloadable Matlab code. Unique in its scale and depth, this textbook sequence…
£199.50
McGraw-Hill Education Quantitative Asset Management Factor Investing
Book SynopsisAugment your asset allocation strategy with machine learning and factor investing for unprecedented returns and growth. Whether you're managing institutional portfolios or private wealth, Quantitative Asset Management will open your eyes to a new, more successful way of investing: one that harnesses the power of big data and artificial intelligence. This innovative guide walks you through everything you need to know to fully leverage these revolutionary tools. Written from the perspective of a seasoned financial investor making use of technology, it details proven investing methods, striking a rare balance: providing important technical information without burdening you with overly complex investing theory. Quantitative Asset Management is organized into four thematic sections: Part 1 reveals invaluable lessons for planning and governance of investment decision-making. Part 2 discusses quantitative financial modeling, covering…
£47.19
McGraw-Hill Education Fuzzy Logic Applications in Artificial
Book SynopsisFuzzy logic principles, practices, and real-world applications. This hands-on guide offers clear explanations of fuzzy logic along with practical applications and real-world examples. Written by an award-winning engineer, Fuzzy Logic: Applications in Artificial Intelligence, Big Data, and Machine Learning is aimed at improving competence and motivation in students and professionals alike. Inside, you will discover how to apply fuzzy logic in the context of pervasive digitization and big data across emerging technologies, which require a very different man-machine relationship than the ones previously used in engineering, science, economics, and social sciences. Applications covered include intelligent energy systems with demand response, smart homes, electrification of transportation, supply chain efficiencies, smart cities, e-commerce, education, healthcare, and decarbonization. Serves as a classroom guide and as an on-the-job resource
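For readers new to the topic, the core mechanics the book builds on can be sketched in a few lines. The following Python fragment is an illustration of the general technique, not code from this book; all membership breakpoints and rule outputs are made-up values. It defines a triangular membership function and evaluates a tiny two-rule, Sugeno-style fan controller.

import numpy as np

def tri(x, a, b, c):
    # Triangular membership: 0 below a, peak 1 at b, back to 0 at c.
    return float(np.clip(min((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0))

temp = 28.0                    # crisp input, degrees Celsius
warm = tri(temp, 20, 25, 30)   # degree of membership in "warm"
hot = tri(temp, 25, 32, 40)    # degree of membership in "hot"

# Two rules: if warm then fan at 40%; if hot then fan at 90%.
# Weighted-average defuzzification turns firing strengths into one output.
fan_speed = (warm * 40 + hot * 90) / max(warm + hot, 1e-9)
print(round(fan_speed, 1))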
£72.89
Dundurn Group Ltd Dancing With Robots
Book SynopsisSurvive and thrive in a world being taken over by robots and other advanced technology. Artificial intelligence, machine learning, algorithms, blockchains, the Internet of Things, big data analytics, 5G networks, self-driving cars, robotics, 3D printing. In the coming years, these technologies, and others to follow, will have a profound and dramatically disruptive impact on how we work and live. Whether we like it or not, we need to develop a good working relationship with these technologies. We need to know how to dance with robots. In Dancing with Robots, futurist, entrepreneur, and innovation coach Bill Bishop describes 29 strategies for success in the New Economy. These new strategies represent a bold, exciting, unexpected, and radically different road map for future success. Bishop also explains how our Five Human Superpowers (embodied pattern recognition, unbridled curiosity, purpose-driven ideation, ethical framing, and metaphoric communication)…Trade ReviewI’ve devoured just about everything Bill Bishop has written. He has the unique ability of distilling complex ideas into simple, doable, and commercial outcomes. Three such ideas I took, and actioned, from his last book were simply to identify a big goal; a big problem; and a signature program to satiate them. * Pádraic Ó Máille, Founder Smácht Training *Bill brings the reader into the future with Dancing with Robots, and prepares the astute leader to shed old world economy beliefs and, with urgency, learn, embrace and put into practice new economy beliefs. * Keith Cupp, Founder Gravitas Impact *Leaders walk the line of dealing with the unknown, building confidence to act in those around them, and operating from principles instead of scripts. Bishop is such a leader. In Dancing with Robots, readers will go to the precipice of the immediate future in four intersecting domains: humanity, technology, thinking, and business. 29 strategies take the edge off of how to embrace artificial intelligence and technology - yes, to dance with it! Readers from the humanities, business, entrepreneurs, healthcare, and futurists will appreciate the nuanced stories and sharp focus that supports each strategy. Highly recommended. * Michael R Bleich, Senior Professor and Director, Virginia Commonwealth University Langston Center for Innovation in Quality and Safety *Bishop’s use of examples, pop culture references, and personal anecdotes to support his points keeps the material relatable and makes for an enjoyable read. * The Miramichi Reader *Dancing With Robots is often inspirational. * The Winnipeg Free Press *Table of ContentsPrefaceIntroduction: The Five Human SuperpowersThe 29 Strategies for Success in the Age of AI and AutomationStrategy No. 1: Increase Well-Being Using Fewer ResourcesStrategy No. 2: Focus First on Who We Want to HelpStrategy No. 3: Build a Value Proposition Around a Big IdeaStrategy No. 4: Grow a Network OrganicallyStrategy No. 5: Ask Purpose-Driven QuestionsStrategy No. 6: Transcend and IntegrateStrategy No. 7: DematerializeStrategy No. 8: Mass-CustomizeStrategy No. 9: Facilitate FlowStrategy No. 10: Embrace Radical RealityStrategy No. 11: Tame Our AlgorithmsStrategy No. 12: Go Forth Without BordersStrategy No. 13: Think Big, Start SmallStrategy No. 14: TransformStrategy No. 15: Build a Platform of PlatformsStrategy No. 16: Speak MetaphoricallyStrategy No. 17: Get Paid for Direct ResultsStrategy No. 18: Make Problems a Renewable ResourceStrategy No. 19: Combine Digital with AnalogueStrategy No. 
20: Invent the FutureStrategy No. 21: Hold the CentreStrategy No. 22: Pay AttentionStrategy No. 23: Be an Industry OutsiderStrategy No. 24: Co-CreateStrategy No. 25: Frame Everything EthicallyStrategy No. 26: Stop Working So HardStrategy No. 27: Smarten UpStrategy No. 28: Connect with NatureStrategy No. 29: Be HumanConclusion: They Shoot Horses, Don't They?Afterword: Tentacles Acknowledgements BibliographyAbout the Author
£15.29
APress Microsoft Conversational AI Platform for
Book SynopsisIntermediate-Advanced user levelTable of ContentsChapter 1: Introduction to the Microsoft Conversational AI PlatformChapter 2: Introduction to the Microsoft Bot FrameworkChapter 3: Introduction to Azure Cognitive ServicesChapter 4: Design Principles of a ChatbotChapter 5: Building a ChatbotChapter 6: Testing a ChatbotChapter 7: Publishing a ChatbotChapter 8: Connecting a Chatbot with Channels
£37.49
APress Hands-on Azure Cognitive Services
Book SynopsisIntermediate-Advanced user levelTable of ContentsChapter 1: The Power of Cognitive Services Chapter Goal: This first chapter sets up the values, reasons, and impacts you can achieve through Microsoft Azure Cognitive Services. It provides an overview of the features and capabilities. The chapter also introduces you to our case study and structures that we’ll use throughout the rest of the book. No of pages: 14 Sub - Topics 1. Overview of Azure Cognitive Services 2. Understanding the Use Cases 3. Exploring the Cognitive Services APIs: Vision, Speech, Language, Search, and Decision 4. Overview of Machine Learning 5. The COVID-19 SmartApp Scenario Chapter 2: The Azure Portal for Cognitive Services Chapter Goal: The aim of this chapter is to get started with Microsoft Cognitive Services by exploring the Azure Portal. This chapter will explore the Azure Portal for Cognitive Services and some of its common features. Finally, the chapter will take you inside the Azure Marketplace for Bot Service, Cognitive Services, and Machine Learning. No of pages: 18 Sub - Topics 1. Getting started with Azure Portal and Microsoft Cognitive Services 2. Azure Marketplace – an overview of AI + Machine Learning 3. Getting started with Azure Bot Service 4. Understanding software development kits (SDKs) – to get started with a favorite programming language [Ref. https://docs.microsoft.com/en-us/azure/cognitive-services/] 5. Setting up your Visual Studio template Chapter 3: Vision – Identify and Analyze Images and Videos Chapter Goal: This chapter will provide insight on Computer Vision with a full hands-on example, where we build an application to analyze an image. There are two features currently in preview that this chapter will also cover: Form Recognizer and Ink Recognizer. No of pages: 24 Sub - Topics 1. Understanding the Vision API with Computer Vision 2. Analyzing images 3. Identifying a face 4. Understanding the working behavior of vision APIs for Video Analysis 5. Recognizing forms, tables, and ink 6. Summary of the Vision API Chapter 4: Language – Gain an Understanding of Unstructured Text and Models Chapter Goal: This chapter will provide insight on NLP (Natural Language Processing) by evaluating user sentiments. The chapter will also touch on preview features – including Immersive Reader. No of pages: 20 Sub - Topics 1. Creating and understanding language models 2. Training language models 3. Translating text to create your own translator application 4. Using QnA Maker to host conversational discussions about your data 5. Using Immersive Reader to understand text via audio and visual cues 6. Summary of the Language API Chapter 5: Speech – Talk to Your Application Chapter Goal: This chapter will provide insight on speech services by translating text to speech and vice versa, enabling a speaker, and translating into multiple languages. The chapter will also touch on a preview feature – Speaker Recognition. The Bing speech feature will not be covered as it is retiring soon. No of pages: 18 Sub - Topics 1. Understanding speech and speech services 2. Converting speech into text and vice versa 3. Translating speech real-time into your application 4. Identifying the speaker from speech using Speaker Recognition 5. Customizing speech 6. Summary of the Speech API Chapter 6: Decision – Make Smarter Decisions In Your Applications Chapter Goal: This chapter will provide insight on decision services by adding a content moderation facility to the application. 
The chapter will also touch on a preview feature – Anomaly Detector. No of pages: 17 Sub - Topics 1. Understanding the decision service and decision APIs 2. Creating an auto Content Moderator application 3. Creating personalized experiences with the Personalizer 4. Identifying future problems with the Anomaly Detector 5. Summary of the Decision API Chapter 7: Search – Add Search Capabilities to Your Application Chapter Goal: This chapter will provide insight on Bing Search APIs by adding various search functionalities to the application. No of pages: 18 Sub - Topics 1. Understanding search and the Bing Search APIs 2. Creating a smart application by adding Bing Search 3. Offering users auto-suggestions 4. Summary of the Search API Chapter 8: Deploy and Host Services Using Containers Chapter Goal: This chapter will provide a complete insight on Cognitive Services containers. In this chapter, we will highlight the key features by creating an application. The application will be deployed using Docker. No of pages: 22 Sub - Topics 1. Getting started with Cognitive Services containers 2. Understanding deployment and how to deploy and run a container on an Azure container instance 3. Understanding Docker Compose and using it to deploy multiple containers 4. Understanding Azure Kubernetes Service and how to deploy an application to Azure Kubernetes Service Chapter 9: Azure Bot Service Chapter Goal: This chapter will provide insight on Bot Service by creating the COVID-19 Bot. No of pages: 24 Sub - Topics 1. Understanding Azure Bot services 2. Creating a COVID-19 Bot using Azure Bot Service 3. Using the Azure Bot Builder SDK. Reference: https://docs.microsoft.com/en-us/azure/bot-service/dotnet/bot-builder-dotnet-sdk-quickstart?view=azure-bot-service-4.0 Chapter 10: Azure Machine Learning Chapter Goal: This chapter will lead the reader to fully understand Azure Machine Learning and how to use it. You can train your application to learn without being explicitly programmed. We will include forecasts and predictions. The chapter will cover a preview feature – the Azure Machine Learning designer. No of pages: 22 Sub - Topics 1. Building models with no code, using the Azure Machine Learning designer 2. Publishing to Jupyter notebooks 3. Building ML models in Python or R 4. The ML Visual Studio Code extension 5. Commanding the ML CLI 6. Summary of ML
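To give a flavour of the Chapter 3 material, here is a hedged Python sketch that calls the Computer Vision image-analysis REST endpoint directly with requests. The endpoint, key, and image URL are placeholders to replace with your own resource's values, and the book itself may use the Azure SDK rather than raw REST.

import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "<your-subscription-key>"                                   # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Description,Tags"},  # ask for captions and tags
    headers={"Ocp-Apim-Subscription-Key": KEY,
             "Content-Type": "application/json"},
    json={"url": "https://example.com/photo.jpg"},  # placeholder image URL
)
resp.raise_for_status()
print(resp.json()["description"]["captions"][0]["text"])  # best-guess caption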
£48.74
APress Synthetic Data for Deep Learning
Book SynopsisData is the indispensable fuel that drives the decision making of everything from governments, to major corporations, to sports teams. Its value is almost beyond measure. But what if that data is either unavailable or problematic to access? That's where synthetic data comes in. This book will show you how to generate synthetic data and use it to maximum effect. Synthetic Data for Deep Learning begins by tracing the need for and development of synthetic data before delving into the role it plays in machine learning and computer vision. You'll gain insight into how synthetic data can be used to study the benefits of autonomous driving systems and to make accurate predictions about real-world data. You'll work through practical examples of synthetic data generation using Python and R, placing its purpose and methods in a real-world context. Generative Adversarial Networks (GANs) are also covered in detail, explaining how they work and their potential applicationsTable of ContentsChapter 1: Introduction to Data 40 pagesChapter Goal: The book section entitled "Data" aims to provide readers with information on the history, definition, and future of data storage, as well as the role that synthetic data can play in the field of computer vision. 1.1. The History of Data1.3. Definitions of Synthetic Data1.4. The Lifecycle of Data1.5. The Future of Data Storage1.6. Synthetic Data and Metaverse1.7. Computer Vision1.8. Generating an Artificial Neural Network Using Package “nnet” in R1.9. Understanding of Visual Scenes1.10. Segmentation Problem1.11. Accuracy Problems1.12. Generative Pre-trained Transformer 3 (GPT-3) Chapter 2: Synthetic Data 40 pagesChapter Goal: The purpose of this chapter is to provide information about synthetic data and how it can be used to benefit autonomous driving systems. Synthetic data is a term used to describe data that has been generated by a computer. 2.1. Synthetic Data2.2. A Brief History of Synthetic Data2.3. Types of Synthetic Data2.4. Benefits and Challenges of Synthetic Data2.5. Generating Synthetic Data in A Simple Way2.6. An Example of Biased Synthetic Data Generation2.7. Domain Transfer2.8. Domain Adaptation2.9. Domain Randomization2.10. Using Video Games to Create Synthetic Data2.11. Synthetic Data And Autonomous Driving System2.11.1. Perception2.11.2. Localization2.11.3. Prediction2.11.4. Decision Making2.12. Simulation in Autonomous Vehicle Companies2.13. How to Make Automatic Data Labeling? 2.14. Is Real-World Experience Unavoidable? 2.15. Data for Learning Medical Images2.16. Reinforcement Learning2.17. Self-Supervised LearningChapter 3: Synthetic Data Generation with R..... 55 pagesChapter Goal: The purpose of this book section is to provide information about the content and purpose of synthetic data generation with R. Synthetic data is generated data that is used to mimic real data. There are many reasons why one might want to generate synthetic data. For example, synthetic data can be used to test data-driven models when real data is not available. Synthetic data can also be used to protect the privacy of individuals in data sets.3.1. Basic Functions Used In Generating Synthetic Data3.1.1. Creating a Value Vector from a Known Univariate Distribution3.1.2. Vector Generation from a Multi-levels Categorical Variable3.1.3. Multivariate3.1.4. Multivariate (with correlation) 3.2. Multivariate Imputation Via Mice Package in R3.2.1. Example of MICE3.3. Augmented Data3.4. Image Augmentation Using Torch Package3.5. 
Generating Synthetic Data with The "conjurer" Package in R3.5.1. Create a Customer3.5.2. Create a Product3.5.3. Creating Transactions3.5.4. Generating Synthetic Data3.6. Generating Synthetic Data With “Synthpop” Package In R3.7. Copula3.7.1. t Copula3.7.2. Normal Copula3.7.3. Gaussian CopulaChapter 4: GANs.... 15 pagesChapter Goal: This book chapter aims to provide information on the content and purpose of GANs. GANs are a type of artificial intelligence that is used to generate new data that is similar to the training data. This is done by training a generator network to produce data that is similar to the training data. The generator network is trained by using a discriminator network, which is used to distinguish between the generated data and the training data. 4.1. GANs4.2. CTGAN4.3. SurfelGAN4.4. Cycle GANs4.5. SinGAN4.6. DCGAN4.7. medGAN4.8. WGAN4.9. seqGAN4.10. Conditional GANChapter 5: Synthetic Data Generation with Python.... 40 pagesChapter Goal: The purpose of this chapter is to provide information about the methods of synthetic data generation with Python. Python is a widely used high-level programming language that is known for its ease of use and readability. It has a large standard library that covers a wide range of programming tasks.5.1. Data Generation with Known Distribution5.2. Synthetic Data Generation in Regression Problem5.3. Gaussian Noise Applied to Regression Model5.4. Friedman Functions and Symbolic Regression5.5. Synthetic Data Generation for Classification and Clustering Problems5.6. Clustering Problems5.7. Generating Tabular Synthetic Data by Applying GANs
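As a taste of the Chapter 5 material (synthetic data generation with Python), this short sketch draws values from a known distribution and then builds a synthetic regression problem with additive Gaussian noise using scikit-learn. All sizes and parameters are illustrative assumptions, not values from the book.

import numpy as np
from sklearn.datasets import make_regression

rng = np.random.default_rng(42)
# 5.1-style: sample a synthetic "age" column from a known (normal) distribution
ages = rng.normal(loc=40, scale=12, size=1_000)

# 5.2/5.3-style: a synthetic regression dataset with Gaussian noise baked in
X, y = make_regression(n_samples=1_000, n_features=5, noise=10.0, random_state=42)
print(ages[:3], X.shape, y.shape)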
£37.49
APress Productionizing AI
Book SynopsisChapter 1: Introduction to AI & the AI Ecosystem.- Chapter 2: AI Best Practice & DataOps.- Chapter 3: Data Ingestion for AI.- Chapter 4: Machine Learning on Cloud.- Chapter 5: Neural Networks and Deep Learning.- Chapter 6: The Employer's Dream: AutoML, AutoAI and the rise of NoLo UIs.- Chapter 7: AI Full Stack: Application Development.- Chapter 8: AI Case Studies.- Chapter 9: Deploying an AI Solution (Productionizing & Containerization).- Chapter 10: Natural Language Processing.- Postscript.Table of ContentsChapter 1: Introduction to AI & the AI EcosystemChapter Goal: Embracing the hype and the pitfalls, this chapter introduces the reader to current and emerging trends in AI and how many businesses and organisations are struggling to get machine and deep learning operationalizedNo of pages: 30Sub - Topics 1. The AI ecosystem2. Applications of AI3. AI pipelines4. Machine learning5. Neural networks & deep learning6. Productionizing AIChapter 2: AI Best Practice & DataOpsChapter Goal: Help the reader understand the wider context for AI, key stakeholders, the importance of collaboration, adaptability and re-use as well as DataOps best practice in delivering high-performance solutionsNo of pages: 20Sub - Topics 1. Introduction to DataOps and MLOps 2. Agile development3. Collaboration and adaptability4. Code repositories5. Data pipeline orchestration6. CI / CD7. Testing, performance evaluation & monitoringChapter 3: Data Ingestion for AIChapter Goal: Inform on best practice and the right (cloud) data architectures and orchestration requirements to ensure the successful delivery of an AI project.No of pages: 20Sub - Topics: 1. Introduction to data ingestion2. Data stores for AI3. Data lakes, warehousing & streaming4. Data pipeline orchestrationChapter 4: Machine Learning on CloudChapter Goal: Top-down ML model building from design thinking, through high-level process, data wrangling, unsupervised clustering techniques, supervised classification, regression and time series approaches, before interpreting results and algorithmic performance No of pages: 20Sub - Topics: 1. ML fundamentals2. EDA & data wrangling3. Supervised & unsupervised machine learning4. Python Implementation5. Unsupervised clustering, pattern & anomaly detection6. Supervised classification & regression case studies: churn & retention modelling, risk engines, social media sentiment analysis7. Time series forecasting and comparison with fbprophetChapter 5: Neural Networks and Deep LearningChapter Goal: Help the reader establish the right artificial neural network architecture, data orchestration and infrastructure for deep learning with TensorFlow, Keras and PyTorch on CloudNo of pages: 40Sub - Topics: 1. An introduction to deep learning2. Stochastic processes for deep learning3. Artificial neural networks4. Deep learning tools & frameworks5. Implementing a deep learning model6. Tuning a deep learning model7. Advanced topics in deep learningChapter 6: The Employer’s Dream: AutoML, AutoAI and the rise of NoLo UIsChapter Goal: Building on acquired ML and DL skills, learn to leverage the growing ecosystem of AutoML, AutoAI and No/Low code user interfacesNo of pages: 20Sub - Topics: 1. AutoML2. Optimizing the AI pipeline3. Python-based libraries for automation4. Case Studies in Insurance, HR, FinTech & Trading, Cybersecurity and Healthcare5. 
Tools for AutoAI: IBM Cloud Pak for Data, Azure Machine Learning, Google Teachable MachinesChapter 7: AI Full Stack: Application Development Chapter Goal: Starting from key business/organizational needs for AI, identify the correct solution and technologies to develop and deliver “Full Stack AI”No of pages: 20Sub - Topics: 1. Introduction to AI application development2. Software for AI development3. Key Business applications of AI:• ML Apps• NLP Apps• DL Apps4. Designing & building an AI applicationChapter 8: AI Case StudiesChapter Goal: A comprehensive (multi-sector, multi-functional) look at the main AI use cases in 2022No of pages: 20Sub - Topics: 1. Industry case studies2. Telco solutions3. Retail solutions4. Banking & financial services / fintech solutions5. Oil & gas / energy & utilities solutions6. Supply chain solutions7. HR solutions8. Healthcare solutions9. Other case studiesChapter 9: Deploying an AI Solution (Productionizing & Containerization)Chapter Goal: A practical look at “joining the dots” with full-stack deployment of Enterprise AI on CloudNo of pages: 20Sub - Topics: 1. Productionizing an AI application2. AutoML / AutoAI3. Storage & Compute4. Containerization5. The final frontier…
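As one hypothetical illustration of the Chapter 9 theme (productionizing a model as a containerizable web service), here is a minimal FastAPI sketch; the model file, feature layout, and framework choice are all assumptions made for this example, not the book's own stack.

import pickle
from typing import List

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
with open("model.pkl", "rb") as f:   # placeholder: a previously trained model
    model = pickle.load(f)

class Features(BaseModel):
    values: List[float]              # one flat feature vector per request

@app.post("/predict")
def predict(features: Features):
    # scikit-learn style predict on a single-row batch
    return {"prediction": model.predict([features.values]).tolist()}

Served with uvicorn (e.g. uvicorn main:app) and then wrapped in a Docker image, this is the smallest version of the deploy-and-containerize story the chapter tells.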
£41.24
APress Time Series Algorithms Recipes
Book Synopsis Chapter 1: Getting Started with Time Series.- Chapter 2: Statistical Univariate Modelling.- Chapter 3: Statistical Multivariate Modelling.- Chapter 4: Machine Learning Regression-Based Forecasting.- Chapter 5: Forecasting Using Deep Learning. Table of ContentsChapter 1: Getting Started with Time Series.Chapter Goal: Exploring, analyzing, and preprocessing time series data, including feature engineering for model building.No of pages: 25Sub - Topics 1 Reading time series data2 Data cleaning3 EDA4 Trend5 Noise6 Seasonality7 Cyclicity8 Feature Engineering9 StationarityChapter 2: Statistical Univariate ModellingChapter Goal: The fundamentals of time series forecasting with the use of statistical modelling methods like AR, MA, ARMA, ARIMA, etc. No of pages: 25Sub - Topics 1 AR2 MA3 ARMA4 ARIMA5 SARIMA6 AUTO ARIMA7 FBProphetChapter 3: Statistical Multivariate ModellingChapter Goal: Implementing multivariate modelling techniques like Holt-Winters and SARIMAX.No of pages: 25Sub - Topics: 1 Holt-Winters 2 ARIMAX3 SARIMAXChapter 4: Machine Learning Regression-Based Forecasting.Chapter Goal: Building and comparing multiple classical ML regression algorithms for time series forecasting.No of pages: 25Sub - Topics: 1 Random Forest2 Decision Tree3 Light GBM4 XGBoost5 SVMChapter 5: Forecasting Using Deep Learning.Chapter Goal: Implementing advanced concepts like deep learning for time series forecasting from scratch.No of pages: 25Sub - Topics: 1 LSTM 2 ANN3 MLP
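To anchor the statistical recipes in something concrete, here is a minimal sketch of univariate ARIMA forecasting with statsmodels on a toy series; the (p, d, q) order and the synthetic data are illustrative assumptions, not a recipe from the book.

import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(size=200))    # toy random-walk series

fit = ARIMA(y, order=(1, 1, 1)).fit()  # AR(1) + first difference + MA(1)
print(fit.forecast(steps=5))           # five-step-ahead point forecast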
£22.49
APress Precision Health and Artificial Intelligence
Book SynopsisBeginning user levelTable of ContentsChapter 1: Introduction to Precision Health and Artificial IntelligenceChapter Goal: An introduction to precision health, the concepts of AI-wearables, health data and health tech and how they transform the health industry No of pages: 15Chapter 2: Foundations of Precision HealthChapter Goal: A deep dive into precision health including key principles and processes.No of pages: 25 Chapter 3: DataChapter Goal: Data has been the beginning of many great products, services or ventures in health tech — explore types of data, and how they can be used.No of pages: 25Sub - Topics: 1. Little and big data2. Types of data3. Wearables and IoT, genomics4. Using data to enable precision health Chapter 4: Artificial Intelligence in Precision HealthChapter Goal: Concepts and ideas in artificial intelligence (AI) and machine learning -- including statistical approaches, visualization, human-computer interactions and evaluating health AI.Pages: 251. Statistical approaches2. Visualization3. Human computer interaction4. Evaluations of AIChapter 5: Ethics and RegulatoryChapter Goal: An in-depth study of legal, ethical, and regulatory concepts in precision health.No of pages: 35Sub - Topics:1.Ethics2.Legal3.Regulatory concerns Chapter 6: Case Studies: The Application of Artificial Intelligence in Precision Healthcare and MedicineChapter Goal: Applications of AI techniques and software tools. This will primarily involve exploring recent examples of AI and Machine Learning tools being specifically used to aid in clinical practice.Pages: 251. Best case examples of AI to aid clinical practice
£37.49
APress Exploring the Power of ChatGPT
Book SynopsisLearn how to use the large-scale natural language processing model developed by OpenAI: ChatGPT. This book explains how ChatGPT uses machine learning to autonomously generate text based on user input and explores the significant implications for human communication and interaction. Author Eric Sarrion examines various aspects of ChatGPT, including its internal workings, use in computer projects, and impact on employment and society. He also addresses long-term perspectives for ChatGPT, including possible future advancements, adoption challenges, and considerations for ethical and responsible use. The book starts with an introduction to ChatGPT covering its versions, application areas, how it works with neural networks, NLP, and its advantages and limitations. Next, you'll be introduced to applications and training development projects using ChatGPT, as well as best practices for it. You'll then explore the ethical implications of ChatGPT, such as potential biases and risks, regulation…Table of ContentsPart 1: Introduction to ChatGPT1 - What is ChatGPT?Describes what ChatGPT is, its history...• 1.1 Definition of ChatGPT• 1.2 ChatGPT History• 1.3 Versions of ChatGPT• 1.4 Application areas of ChatGPT2 - How Does ChatGPT Work?Describes how it works inside• 2.1 Neural networks• 2.2 Natural language processing techniques used by ChatGPT• 2.3 The data used to train ChatGPT• 2.4 The advantages and limitations of ChatGPT3 - Applications of ChatGPTDescribes what you can do with ChatGPT• 3.1 Chatbots and virtual assistants• 3.2 Machine translation apps• 3.3 Content writing apps• 3.4 Applications in information retrievalPart 2: How To Train and Use ChatGPT4 - ChatGPT TrainingDescribes how to build the models used by ChatGPT• 4.1 Data collection and preparation• 4.2 ChatGPT training settings• 4.3 Training tools available• 4.4 Techniques to improve ChatGPT performance5 - Using ChatGPT in Development ProjectsDescribes how to use ChatGPT in a web page with an API• 5.1 Libraries and frameworks for ChatGPT• 5.2 Examples of projects using ChatGPT• 5.3 Techniques to integrate ChatGPT into applications• 5.4 Use ChatGPT with the OpenAI API• 5.5 Use ChatGPT with a voice interface• 5.6 Methods to evaluate the performance of ChatGPT6 - Best Practices for Using ChatGPTDescribes how to optimize ChatGPT• 6.1 Strategies to ensure the quality of input data• 6.2 Techniques to avoid bias in data• 6.3 Methods to optimize ChatGPT performance• 6.4 ChatGPT maintenance tipsPart 3: The Ethical Implications of ChatGPT7 - Potential Biases and Risks of ChatGPTDescribes biases and risks of ChatGPT• 7.1 Sources of bias in the data• 7.2 The risks of discrimination and stigmatization• 7.3 The limits of ChatGPT transparency• 7.4 Consequences for privacy and data security 8 - The Implications of ChatGPT on Employment and SocietyDescribes impacts on employment and society• 8.1 The impacts on employment in various sectors• 8.2 The implications for education and vocational training• 8.3 Consequences for social and cultural norms• 8.4 Political and legal responses to the changes brought about by ChatGPT9 - Regulations and Standards for Using ChatGPTDescribes responsible use of ChatGPT• 9.1 Existing regulations for consumer protection• 9.2 Standards for Responsible Use of ChatGPT• 9.3 ChatGPT governance initiatives• 9.4 Considerations for Legal and Ethical Responsibility of ChatGPTPart 4: Future Prospects of ChatGPT10 - Future Developments of ChatGPTDescribes future developments • 10.1 Advances in Machine Learning and Natural 
Language Processing Research• 10.2 ChatGPT performance and efficiency improvements• 10.3 Advances in applications and areas of use of ChatGPT• 10.4 Developments in the competition and the ChatGPT market11 - The Long Term Outlook for ChatGPTDescribes long term outlook• 11.1 The implications for artificial intelligence and cognition• 11.2 Merging possibilities between ChatGPT and other emerging technologies• 11.3 The challenges of adopting and accepting ChatGPT• 11.4 Issues for regulation and governance of ChatGPTPart 5 : Examples of Using ChatGPT12 - Using ChatGPT for Text Content Creation13 - Using ChatGPT for Software Programming14 - Using ChatGPT for Text Translation15 - Using ChatGPT for Artistic Content Creation16 - Using ChatGPT for Innovation and Creativity17 - ConclusionGives a conclusion of the book• 17.1 Summaries of the key elements covered in the book• 17.2 Final thoughts on the impact and implications of ChatGPT• 17.3 Suggestions for future research and development on ChatGPT• 17.4 Considerations for the ethical and responsible use of ChatGPT in the future.• 17.5 In conclusion
£26.39
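Chapter 5.4 above covers using ChatGPT with the OpenAI API. As a rough illustration of what that integration typically looks like, here is a minimal Python sketch using the official openai client; it is not taken from the book, and the model name and prompts are placeholder assumptions.

    # Minimal sketch of calling a ChatGPT model via the OpenAI Python client.
    # Assumes OPENAI_API_KEY is set in the environment; the model name below
    # is an illustrative placeholder, not one prescribed by the book.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Summarize what a neural network is."},
        ],
    )
    print(response.choices[0].message.content)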
Manning Publications Machine Learning Systems: Designs that scale
Book Synopsis: Machine learning applications autonomously reason about data at massive scale, and it's important that they remain responsive in the face of failure and changes in load. But machine learning systems differ from other applications when it comes to testing, building, deploying, and monitoring. Reactive Machine Learning Systems teaches readers how to implement reactive design solutions in their machine learning systems to make them as reliable as a well-built web app. Using Scala and powerful frameworks such as Spark, MLlib, and Akka, they'll learn to quickly and reliably move from a single machine to a massive cluster.
Key Features: · Example-rich, step-by-step guide · Move from a single machine to a massive cluster
Readers should have intermediate skills in Java or Scala. No previous machine learning experience is required.
About the Technology: Machine learning systems differ from other applications when it comes to testing, building, deploying, and monitoring. To make machine learning systems reactive, you need to understand both reactive design patterns and modern data architecture patterns.
£32.39
Manning Publications Machine Learning for Business: Using Amazon
Book Synopsis: Imagine predicting which customers are thinking about switching to a competitor, or flagging potential process failures before they happen. Think about the benefits of automating tedious business processes and back-office tasks. Consider the competitive advantage of making decisions when you know the most likely future events. Machine learning can deliver these and other advantages to your business, and it's never been easier to get started! Machine Learning for Business teaches you how to make your company more automated, productive, and competitive by mastering practical, implementable machine learning techniques and tools. Thanks to the authors' down-to-earth style, you'll easily grok why process automation is so important and why machine learning is key to its success. In this hands-on guide, you'll work through seven end-to-end automation scenarios covering business processes in accounts payable, billing, payroll, customer support, and other common tasks. Using Amazon SageMaker (no installation required!), you'll build and deploy machine learning applications as you practice takeaway skills you'll use over and over. By the time you're finished, you'll confidently identify machine learning opportunities in your company and implement automated applications that can sharpen your competitive edge!
Key Features: · Identifying processes suited to machine learning · Using machine learning to automate back-office processes · Seven everyday business-process projects · Using open source and cloud-based tools · Case studies for machine learning decision making
For technically inclined business professionals or business developers. No previous experience with automation tools or programming is necessary. Doug Hudgeon runs a business automation consultancy, putting his considerable experience helping companies set up automation and machine learning teams to good use. In 2000, Doug launched one of Australia's first electronic invoicing automation companies. Richard Nichol has over 20 years of experience as a data scientist and software engineer. He currently specializes in maximizing the value of data through AI and machine learning techniques.
£26.99
Manning Publications Ensemble Methods for Machine Learning
Book Synopsis: Many machine learning problems are too complex to be resolved by a single model or algorithm. Ensemble machine learning trains a group of diverse machine learning models to work together to solve a problem; by aggregating their output, these ensemble models can flexibly deliver rich and accurate results. Ensemble Methods for Machine Learning is a guide to ensemble methods with proven records in data science competitions and real-world applications. Through hands-on case studies, you'll develop an under-the-hood understanding of foundational ensemble learning algorithms to deliver accurate, performant models (a minimal illustration follows this entry).
About the Technology: Ensemble machine learning lets you make robust predictions without needing the huge datasets and processing power demanded by deep learning. It sets multiple models to work on solving a problem, combining their results for better performance than a single model working alone. This "wisdom of crowds" approach distils information from several models into a set of highly accurate results.
Trade Review: "The definitive and complete guide on ensemble learning. A must read!" Al Krinker · "The examples are clear and easy to reproduce, the writing is engaging and clear, and the reader is not bogged down by details which might be unimportant for beginners in the field!" Or Golan · "This book is a great tutorial on ensemble methods!" Stephen Warnett · "The code examples as well as the case studies at the end of each chapter open many possibilities of using these techniques on your data/projects." Joaquin Beltran
£41.39
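To make the "wisdom of crowds" idea above concrete, here is a generic scikit-learn sketch (not code from the book) that combines three diverse models by majority vote; the dataset and model choices are illustrative assumptions.

    # Generic ensemble sketch: three diverse models combined by majority vote.
    # Dataset and model choices are illustrative, not taken from the book.
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier, VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.naive_bayes import GaussianNB

    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    ensemble = VotingClassifier(
        estimators=[
            ("lr", LogisticRegression(max_iter=5000)),
            ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
            ("nb", GaussianNB()),
        ],
        voting="hard",  # aggregate predictions by majority vote
    )
    ensemble.fit(X_train, y_train)
    print("test accuracy:", ensemble.score(X_test, y_test))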
Manning Publications Automated Machine Learning in Action
Book Synopsis: Optimize every stage of your machine learning pipelines with powerful automation components and cutting-edge tools like AutoKeras and KerasTuner. Automated Machine Learning in Action, filled with hands-on examples and written in an accessible style, reveals how premade machine learning components can automate time-consuming ML tasks. The book teaches you to automate selecting the best machine learning models or data preparation methods for your own machine learning tasks, so your pipelines tune themselves without needing constant input. You'll quickly run through the machine learning basics that open up AutoML to non-data scientists, before putting AutoML into practice for image classification, supervised learning, and more. Automated machine learning (AutoML) automates complex and time-consuming stages in a machine learning pipeline with prepackaged optimal solutions. This frees data scientists from data processing and manual tuning, and lets domain experts easily apply machine learning models to their projects. (A minimal AutoKeras sketch follows this entry.)
Trade Review: "Automating automation itself is a new concept and this book does justice to it in terms of explaining the concepts, sharing real-world advancements, use cases and research related to the topic." Satej Kumar Sahu · "A book with a lot of promise, covering a topic that's likely to become hot in the next year or so. Read this now, and get ahead of the curve!" Richard Vaughan · "A nice introduction to AutoML, its ambitions, and challenges both in theory and in practice." Alain Couniot · "Helps you to clearly understand the process of machine learning automation. The examples are clear, concise, and applicable to the real world." Walter Alexander Mata López · "The author's friendly style makes novices feel ready to try out AutoML tools." Gaurav Kumar Leekha · "A great book to take your machine learning skills to the next level." Harsh Raval · "An impressive effort by the authors to break down a complex ML topic into understandable chunks." Venkatesh Rajagopal
£34.19
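As a taste of the pipelines-that-tune-themselves workflow in one of the tools the book names, here is a minimal AutoKeras sketch, assuming autokeras and TensorFlow are installed; the tiny trial and epoch counts are illustrative, not recommendations from the book.

    # Minimal AutoKeras sketch: automated model search for image classification.
    # max_trials/epochs are kept tiny for illustration; real runs use more.
    import autokeras as ak
    from tensorflow.keras.datasets import mnist

    (x_train, y_train), (x_test, y_test) = mnist.load_data()

    clf = ak.ImageClassifier(overwrite=True, max_trials=1)  # search one candidate
    clf.fit(x_train, y_train, epochs=1)                     # tuner picks the model
    print(clf.evaluate(x_test, y_test))                     # loss and accuracy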
Manning Publications Deep Learning Design Patterns
Book Synopsis: Deep learning has revealed ways to create algorithms for applications that we never dreamed were possible. For software developers, the challenge lies in taking cutting-edge technologies from R&D labs through to production. Deep Learning Design Patterns is here to help. In it, you'll find deep learning models presented in a unique new way: as extendable design patterns you can easily plug into your software projects. Written by Google deep learning expert Andrew Ferlitsch, it's filled with the latest deep learning insights and best practices from his work with Google Cloud AI. Each valuable technique is presented in a way that's easy to understand and filled with accessible diagrams and code samples.
About the Technology: You don't need to design your deep learning applications from scratch! By viewing cutting-edge deep learning models as design patterns, developers can speed up their creation of AI models and improve model understandability for both themselves and other users.
About the Book: Deep Learning Design Patterns distills models from the latest research papers into practical design patterns applicable to enterprise AI projects. Using diagrams, code samples, and easy-to-understand language, Google Cloud AI expert Andrew Ferlitsch shares insights from state-of-the-art neural networks. You'll learn how to integrate design patterns into deep learning systems from some amazing examples, including a real-estate program that can evaluate house prices just from uploaded photos and a speaking AI capable of delivering live sports broadcasting. Building on your existing deep learning knowledge, you'll quickly learn to incorporate the very latest models and techniques into your apps as idiomatic, composable, and reusable design patterns. (A small sketch of the reusable-block idea follows this entry.)
What's Inside: · Internal functioning of modern convolutional neural networks · Procedural reuse design pattern for CNN architectures · Models for mobile and IoT devices · Composable design pattern for automatic learning methods · Assembling large-scale model deployments · Complete code samples and example notebooks · Accompanying YouTube videos
About the Reader: For machine learning engineers familiar with Python and deep learning.
About the Author: Andrew Ferlitsch is an expert on computer vision and deep learning at Google Cloud AI Developer Relations. He was formerly a principal research scientist for 20 years at Sharp Corporation of Japan, where he amassed 115 US patents and worked on emerging technologies in telepresence, augmented reality, digital signage, and autonomous vehicles. In his present role, he reaches out to developer communities, corporations, and universities, teaching deep learning and evangelizing Google's AI technologies.
£43.19
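The "procedural reuse" pattern for CNN architectures mentioned above can be sketched as a parameterized block function. This is a generic Keras illustration in the spirit of that pattern, not code from the book; the layer sizes and shapes are arbitrary assumptions.

    # Sketch of a reusable Conv-BN-ReLU "group" in the procedural-reuse style:
    # architectures are assembled by calling the block with parameters.
    from tensorflow.keras import Input, Model, layers

    def conv_group(x, filters, n_blocks):
        """Stack n_blocks of Conv-BN-ReLU, then downsample."""
        for _ in range(n_blocks):
            x = layers.Conv2D(filters, 3, padding="same")(x)
            x = layers.BatchNormalization()(x)
            x = layers.ReLU()(x)
        return layers.MaxPooling2D()(x)

    inputs = Input(shape=(224, 224, 3))
    x = conv_group(inputs, 64, 2)   # VGG-like stem; sizes are illustrative
    x = conv_group(x, 128, 2)
    x = layers.GlobalAveragePooling2D()(x)
    outputs = layers.Dense(10, activation="softmax")(x)
    model = Model(inputs, outputs)
    model.summary()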
Manning Publications Engineering Deep Learning Systems
Book Synopsis: Design systems optimized for deep learning models. Written for software engineers, this book teaches you how to implement a maintainable platform for developing deep learning models. In Engineering Deep Learning Systems you will learn how to: · Transfer your software development skills to deep learning systems · Recognize and solve common engineering challenges for deep learning systems · Understand the deep learning development cycle · Automate training for models in TensorFlow and PyTorch · Optimize dataset management, training, model serving, and hyperparameter tuning · Pick the right open-source project for your platform
Engineering Deep Learning Systems is a practical guide for software engineers and data scientists who are designing and building platforms for deep learning. It's full of hands-on examples that will help you transfer your software development skills to implementing deep learning platforms. You'll learn how to build automated and scalable services for core tasks like dataset management, model training/serving, and hyperparameter tuning. This book is the perfect way to step into an exciting (and lucrative) career as a deep learning engineer.
About the Technology: Behind every deep learning researcher is a team of engineers bringing their models to production. To build these systems, you need to understand how a deep learning system's platform differs from other distributed systems. By mastering the core ideas in this book, you'll be able to support deep learning systems in a way that's fast, repeatable, and reliable.
£34.49
Nova Science Publishers Inc Advanced Decision Sciences Based on Deep Learning
Book Synopsis: This book describes the deep learning models and ensemble approaches applied to decision-making problems. The authors have addressed the concepts of deep learning, convolutional neural networks, recurrent neural networks, and ensemble learning in a practical sense, providing complete code and implementation for several real-world examples. The authors of this book teach the concepts of machine learning for undergraduate- and graduate-level classes and have worked with Fortune 500 clients to formulate data analytics strategies and operationalise these strategies. The book will benefit information professionals, programmers, consultants, professors, students, and industry experts who seek a variety of real-world illustrations with an implementation based on machine learning algorithms.
Table of Contents: Preface; Acknowledgements; Introduction; Deep Learning; Convolutional Neural Networks; Recurrent Neural Networks; Ensemble Learning; Implementing DL and Ensemble Learning Models: Real World Use Cases; Appendix; Suggested Reading; About the Authors; Index.
£163.19
Nova Science Publishers Inc Internet of Things and Machine Learning in
Book Synopsis: Agriculture is one of the most fundamental human activities. It has kept humans happier and healthier and helped birth modern society as we know it. As farming has expanded, however, the usage of resources such as land, fertilizer, and water has grown exponentially, and environmental pressures from modern farming techniques have stressed our natural landscapes. Still, by some estimates, worldwide food production will need to increase 70% by 2050 to keep up with global demand. With global populations rising, it falls to technology to make farming processes more efficient. Fortunately, machine learning (ML) and the Internet of Things (IoT) can play a very promising role in the agricultural industry: an AI-powered drone to monitor the field, an IoT-designed automated crop-watering system, sensors embedded in the field to monitor temperature and humidity, and so on. The agriculture industry is the largest in the world, but when it comes to innovation there is a lot more to explore. IoT devices can be used to analyze the status of crops. For instance, with soil sensors, farmers can detect irregular conditions such as high acidity and efficiently tackle these issues to improve their yield (a toy illustration follows this entry). In this book, we point out the challenges facing the agro-industry that can be addressed by ML and IoT and explore the impacts of these technologies in the agriculture sector.
Table of Contents: Preface; Smart Farming Enabling Technologies: A Systematic Review; Internet of Things Platform for Smart Farming; Internet of Things for Smart Farming; A Comprehensive Review on Intelligent Systems for Mitigating Pests and Diseases in Agriculture; Plant Disease Detection Using Image Sensors: A Step Towards Precision Agriculture; Recent Trends in Agriculture Using IoT, Challenges and Opportunities; Early Detection of Infection/Disease in Agriculture; Application of Agriculture Using IoT: Future Prospective for Smart Cities Management 5.0; The Internet of Things (IoT) for Sustainable Agriculture; IoT Based Data Collection and Data Analytics Decision Making for Precision Farming; Index.
£113.59
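The soil-sensor example above boils down to telemetry plus simple rules. A toy Python sketch, with entirely hypothetical readings and thresholds (nothing here comes from the book):

    # Toy illustration of the soil-sensor idea: flag plots whose pH drifts
    # outside an acceptable band. Readings and thresholds are hypothetical.
    ACCEPTABLE_PH = (5.5, 7.0)  # assumed healthy range for the crop

    readings = {"plot-1": 6.3, "plot-2": 4.8, "plot-3": 6.9}  # fake sensor data

    for plot, ph in readings.items():
        low, high = ACCEPTABLE_PH
        if not (low <= ph <= high):
            print(f"{plot}: pH {ph} outside {ACCEPTABLE_PH}; check soil acidity")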
Nova Science Publishers Inc Green Computing and Its Applications
£163.19
O'Reilly Media Deep Learning at Scale
£47.99
Cambridge University Press Exponential Families in Theory and Practice
Book Synopsis: During the past half-century, exponential families have attained a position at the center of parametric statistical inference. Theoretical advances have been matched, and more than matched, in the world of applications, where logistic regression by itself has become the go-to methodology in medical statistics, computer-based prediction algorithms, and the social sciences. This book is based on a one-semester graduate course for first-year Ph.D. and advanced master's students. After presenting the basic structure of univariate and multivariate exponential families, their application to generalized linear models, including logistic and Poisson regression, is described in detail, emphasizing geometrical ideas, computational practice, and the analogy with ordinary linear regression (the standard definitions are sketched after this entry). Connections are made with a variety of current statistical methodologies: missing data, survival analysis and proportional hazards, false discovery rates, bootstrapping, and empirical Bayes analysis. The book co…
Trade Review: 'This book provides a unique perspective on exponential families, bringing together theory and methods into a unified whole. No other text covers the range of topics in this text. If you want to understand the "why" as well as the "how" of exponential families, then this book should be on your bookshelf.' Larry Wasserman, Carnegie Mellon University · 'I am excited to see the publication of this monograph on exponential families by my friend and colleague Brad Efron. I learned some of this material during my Ph.D. studies at Stanford from the maestro himself, as well as the geometry of curved exponential families, Hoeffding's lemma, the Lindsey method, and the list goes on. They have lived with me my entire career and informed our work on GAMs and sparse GLMs. Generations of Stanford students have shared this privilege, and now generations in the future will be able to enjoy the unique Efron style.' Trevor Hastie, Stanford University · 'Exponential families can be magical in simplifying both theoretical and applied statistical analyses. Brad Efron's wonderful book exposes their secrets, from R. A. Fisher's early magic to Efron's own bootstrap: an essential text for understanding how data of all sizes can be approached scientifically.' Stephen Stigler, University of Chicago · 'This book provides an original and accessible study of statistical inference in the class of models called exponential families. The mathematical properties and flexibility of this class makes the models very useful for statistical practice – they underpin the class of generalized linear models, for example. Writing with his characteristic elegance and clarity, Efron shows how exponential families underpin, and provide insight into, many modern topics in statistical science, including bootstrap inference, empirical Bayes methodology, high-dimensional inference, analysis of survival data, missing data, and more.' Nancy Reid, University of Toronto · 'In this book, Brad Efron illuminates the exponential family as a practical, extendible, and crucial ingredient in all manners of data analysis, be they Bayesian, frequentist, or machine learning. He shows us how to shape, understand, and employ these distributions in both algorithms and analysis. The book is crisp, insightful, and indispensable.' David Blei, Columbia University
Table of Contents: 1. One-parameter exponential families; 2. Multiparameter exponential families; 3. Generalized linear models; 4. Curved exponential families, EB, missing data, and the EM algorithm; 5. Bootstrap confidence intervals; Bibliography; Index.
£28.49
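For orientation, the book's core object can be stated in textbook-standard notation (these are standard definitions, not an excerpt): a one-parameter exponential family tilts a carrier density f_0 by its natural parameter, and logistic regression is the GLM whose natural parameter is linear in the covariates.

    f_\theta(y) = e^{\theta y - \psi(\theta)}\, f_0(y), \qquad
    \mathbb{E}_\theta[y] = \psi'(\theta), \quad
    \operatorname{Var}_\theta(y) = \psi''(\theta)

    y_i \sim \operatorname{Bernoulli}(\pi_i), \qquad
    \theta_i = \log\frac{\pi_i}{1 - \pi_i} = x_i^{\top}\beta
    \quad \text{(logistic regression as a GLM)}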
APress Python Data Analytics
Book Synopsis / Table of Contents: 1. An Introduction to Data Analysis; 2. Introduction to the Python World; 3. The NumPy Library; 4. The pandas Library: An Introduction; 5. pandas: Reading and Writing Data; 6. pandas in Depth: Data Manipulation; 7. Data Visualization with matplotlib; 8. Machine Learning with scikit-learn; 9. Deep Learning with TensorFlow; 10. An Example: Meteorological Data; 11. Embedding the JavaScript D3 Library in IPython Notebook; 12. Recognizing Handwritten Digits; 13. Textual Data Analysis with NLTK; 14. Image Analysis and Computer Vision with OpenCV; Appendix A; Appendix B. (A short pandas sketch follows this entry.)
£46.74
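As a flavor of the pandas portion of the stack listed above, a short generic sketch (not an excerpt from the book; the file name and columns are hypothetical):

    # Generic pandas sketch: load a CSV, clean it, and summarize it.
    # "readings.csv" and its columns are hypothetical placeholders.
    import pandas as pd

    df = pd.read_csv("readings.csv")           # hypothetical input file
    df = df.dropna(subset=["temperature"])     # drop rows missing the key column
    summary = df.groupby("city")["temperature"].agg(["mean", "max"])
    print(summary)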
APress Beginning Data Science in R 4
Book Synopsis: Discover best practices for data analysis and software development in R and start on the path to becoming a fully fledged data scientist. Updated for the R 4.0 release, this book teaches you techniques for both data manipulation and visualization and shows you the best way to develop new software packages for R. Beginning Data Science in R 4, Second Edition details how data science is a combination of statistics, computational science, and machine learning. You'll see how to efficiently structure and mine data to extract useful patterns and build mathematical models. This requires computational methods and programming, and R is an ideal programming language for this. Modern data analysis requires computational skills and usually a minimum of programming. After reading and using this book, you'll have what you need to get started with R programming with data science applications. Source code will be available to support your next projects as well; it is available at github.c…
Table of Contents: 1. Introduction to R Programming; 2. Reproducible Analysis; 3. Data Manipulation; 4. Visualizing and Exploring Data; 5. Working with Large Data Sets; 6. Supervised Learning; 7. Unsupervised Learning; 8. More R Programming; 9. Advanced R Programming; 10. Object-Oriented Programming; 11. Building an R Package; 12. Testing and Checking; 13. Version Control; 14. Profiling and Optimizing.
£37.99
Manning Publications Machine Learning Algorithms in Depth
Book Synopsis: Develop a mathematical intuition around machine learning algorithms to improve model performance and effectively troubleshoot complex ML problems. For intermediate machine learning practitioners familiar with linear algebra, probability, and basic calculus. Machine Learning Algorithms in Depth dives into the design and underlying principles of some of the most exciting machine learning (ML) algorithms in the world today. With a particular emphasis on probability-based algorithms, you will learn the fundamentals of Bayesian inference and deep learning. You will also explore the core data structures and algorithmic paradigms for machine learning. You will explore practical implementations of dozens of ML algorithms, including: · Monte Carlo stock price simulation (sketched after this entry) · Image denoising using mean-field variational inference · The EM algorithm for hidden Markov models · Imbalanced learning, active learning, and ensemble learning · Bayesian optimisation for hyperparameter tuning · Dirichlet process K-means for clustering applications · Stock clusters based on inverse covariance estimation · Energy minimisation using simulated annealing · Image search based on a ResNet convolutional neural network · Anomaly detection in time series using variational autoencoders
Each algorithm is fully explored with both the math and a practical implementation, so you can see how it works and put it into action.
About the Technology: Fully understanding how machine learning algorithms function is essential for any serious ML engineer. This vital knowledge lets you modify algorithms to your specific needs, understand the trade-offs when picking an algorithm for a project, and better interpret and explain your results to your stakeholders. This unique guide will take you from relying on one-size-fits-all ML libraries to developing your own algorithms to solve your business needs.
£54.89
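The first case study named above, Monte Carlo stock price simulation, is commonly built on geometric Brownian motion. A small NumPy sketch of that standard construction (parameter values are illustrative assumptions, and this is not the book's code):

    # Monte Carlo stock-price simulation under geometric Brownian motion,
    # a standard construction matching the first case study named above.
    # All parameter values here are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    s0, mu, sigma = 100.0, 0.05, 0.2   # initial price, drift, volatility (assumed)
    n_paths, n_steps, dt = 10_000, 252, 1 / 252

    # S_{t+dt} = S_t * exp((mu - sigma^2/2) * dt + sigma * sqrt(dt) * Z)
    z = rng.standard_normal((n_paths, n_steps))
    log_increments = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
    paths = s0 * np.exp(np.cumsum(log_increments, axis=1))

    print("mean terminal price:", paths[:, -1].mean())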
MIT Press Ltd Probabilistic Graphical Models
£100.80
HarperCollins Publishers How to Speak Whale
Book Synopsis: 'Fascinating' Greta Thunberg · 'Extraordinary' Merlin Sheldrake · 'A must-read' New Scientist · 'Enthralling' George Monbiot · 'Brilliant' Philip Hoare
As a biologist and nature filmmaker, Tom Mustill had always liked whales. But when one landed on his kayak, nearly killing him, and the video clip of the event went viral, he became obsessed. This book traces an extraordinary investigation into the deep ocean and today's cutting-edge science. Using 'underwater ears', robotic fish, big data and machine intelligence, leading scientists and tech entrepreneurs across the world are working to turn the fantasy of Dr Dolittle into a reality, upending much of what we know about these mysterious creatures. But what would it mean if we were to make contact? Can we hope to one day understand animals? Are we ready for what they might say? Enormously original and hugely entertaining, How to Speak Whale is an unforgettable look at how close we truly are to communicating with another species, and how doing so might change our world beyond recognition.
Trade Review: 'A rich exploration of some of the world's most astonishing creatures … Mustill weaves a narrative that will expand your concept of language and deepen your understanding of the many ways there are to be alive. This is an extraordinary book that left me inspired' Merlin Sheldrake, author of Entangled Life · 'A must-read … a hugely engaging personal story of a journey into the future of human-animal communication facilitated by delving into its past' New Scientist · '[An] extensively researched and energetic book … it is via the informed, far-reaching empathy of intermediaries such as Mustill that we stand our best chance of seeing into the non-human depths' New Statesman · 'First-class … Reasoned, entertaining, and fact-filled' Forbes · 'Fascinating and deeply humane' Greta Thunberg · 'A rich, enthralling, brilliant book that opens our eyes and ears to worlds we can scarcely imagine' George Monbiot, Sunday Times bestselling author of Regenesis · 'Tantalizing … Think how transformative it would be if we could chat with whales about their love lives or their sorrows or their thoughts on the philosophy of language' Elizabeth Kolbert, New Yorker · 'Mind-blowing … You will never feel closer to the magnificence of whales' Lucy Jones, author of Losing Eden · 'A scary, important and brilliant book … If we do get to translate "whale", will we like what they've got to say?' Philip Hoare, author of Leviathan · 'Mustill takes us farther, much farther, than Dr. Dolittle ever imagined' Carl Safina · 'Riveting … One of the most exciting and hopeful books I have read in ages' Sy Montgomery, author of The Soul of an Octopus · 'Mustill conveys the richness of whale song and communication' Frans de Waal · 'Lively and informative' Jonathan Slaght, author of Owls of the Eastern Ice · 'Extraordinary' Christiana Figueres
£13.49
Emerald Publishing Limited Learning in Humans and Machines
Book Synopsis: Discusses the analysis, comparison, and integration of computational approaches to learning and research on human learning. This book aims to provide the reader with an overview of the prolific research on learning throughout the disciplines. It also highlights the important research issues and methodologies.
Trade Review: 'The title of this book accurately describes its editors' ambition: outstretching both arms wide open to get hold of as diverse foci as learning in humans, versus what the discipline of machine learning (ML) within artificial intelligence (AI) actually amounts to in the main... Used properly... this volume can be a trove. A trove of leads to lead you outside the grasp of its compass. To the extent that the book can do that for the reader, it has fulfilled its purpose. No other single book, to my knowledge, would do the same for us on this global subject.' Ephraim Nissan, University of Greenwich, Pragmatics & Cognition · 'A certain unity (in this publication's) approach, focusing on the analysis of phenomena in their complexity and developing a "flexible" vision of learning, integrating the role of context, goals and previous knowledge, gives an undeniable coherence to this work.' L'Année Psychologique
Table of Contents: Chapter headings: Towards an Interdisciplinary Learning Science (P. Reimann, H. Spada). A Cognitive Psychological Approach to Learning (S. Vosniadou). Learning to Do and Learning to Understand: A Lesson and a Challenge for Cognitive Modeling (S. Ohlsson). Machine Learning: Case Studies of an Interdisciplinary Approach (W. Emde). Mental and Physical Artifacts in Cognitive Practices (R. Saljo). Learning Theory and Instructional Science (E. De Corte). Knowledge Representation Changes in Humans and Machines (L. Saitta and Task Force 1). Multi-Objective Learning with Multiple Representations (M. Van Someren, P. Reimann). Order Effects in Incremental Learning (P. Langley). Situated Learning and Transfer (H. Gruber et al.). The Evolution of Research on Collaborative Learning (P. Dillenbourg et al.). A Developmental Case Study on Sequential Learning: The Day-Night Cycle (K. Morik, S. Vosniadou). Subject index. Author index.
£87.39
MIT Press Ltd Deep Learning
£80.75