Description

Book Synopsis
Optimization techniques are at the core of data science, including data analysis and machine learning. An understanding of basic optimization techniques and their fundamental properties provides important grounding for students, researchers, and practitioners in these areas. This text covers the fundamentals of optimization algorithms in a compact, self-contained way, focusing on the techniques most relevant to data science. An introductory chapter demonstrates that many standard problems in data science can be formulated as optimization problems. Next, many fundamental methods in optimization are described and analyzed, including: gradient and accelerated gradient methods for unconstrained optimization of smooth (especially convex) functions; the stochastic gradient method, a workhorse algorithm in machine learning; the coordinate descent approach; several key algorithms for constrained optimization problems; algorithms for minimizing nonsmooth functions arising in data science; found…
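As a taste of the gradient method mentioned in the synopsis, here is a minimal sketch (not taken from the book) of plain gradient descent on a smooth convex least-squares objective; the matrix `A`, vector `b`, and step-size choice are illustrative assumptions, not material from the text.

```python
# Illustrative sketch: gradient descent on f(x) = 0.5 * ||A x - b||^2,
# the kind of smooth convex problem the synopsis describes.
import numpy as np

def gradient_descent(A, b, step, iters):
    """Minimize 0.5 * ||A x - b||^2 by repeated gradient steps."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)  # gradient of the least-squares objective
        x = x - step * grad       # step against the gradient
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
# A standard safe step size is 1 / L, where L = ||A||_2^2 is the
# Lipschitz constant of the gradient.
L = np.linalg.norm(A, 2) ** 2
x = gradient_descent(A, b, step=1.0 / L, iters=500)
# Compare against the closed-form least-squares solution.
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x, x_star, atol=1e-4))
```

With the 1/L step size, the iterates converge to the least-squares solution for this well-conditioned problem; the convergence analysis behind that choice is exactly the kind of result the book develops.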

Trade Review
'This delightful compact tome gives the reader all the results they should have in their pocket to contribute to optimization and statistical learning. With the clean, elegant derivations of many of the foundational optimization methods underlying modern large-scale data analysis, everyone from students just getting started to researchers knowing this book inside and out will be well-positioned for both using the algorithms and developing new ones for machine learning, optimization, and statistics.' John C. Duchi, Stanford University
'Optimization algorithms play a vital role in the rapidly evolving field of machine learning, as well as in signal processing, statistics and control. Numerical optimization is a vast field, however, and a student wishing to learn the methods required in the world of data science could easily get lost in the literature. This book does a superb job of presenting the most important algorithms, providing both their mathematical foundations and lucid motivations for their development. Written by two of the foremost experts in the field, this book gently guides a reader without prior knowledge of optimization towards the methods and concepts that are central in modern data science applications.' Jorge Nocedal, Northwestern University
'This timely introductory book gives a rigorous view of continuous optimization techniques which are being used in machine learning. It is an excellent resource for those who are interested in understanding the mathematical concepts behind commonly used machine learning techniques.' Shai Shalev-Shwartz, Hebrew University of Jerusalem
'This textbook is a much-needed exposition of optimization techniques, presented with conciseness and precision, with emphasis on topics most relevant for data science and machine learning applications. I imagine that this book will be immensely popular in university courses across the globe, and become a standard reference used by researchers in the area.' Amitabh Basu, Johns Hopkins University

Table of Contents
1. Introduction; 2. Foundations of smooth optimization; 3. Descent methods; 4. Gradient methods using momentum; 5. Stochastic gradient; 6. Coordinate descent; 7. First-order methods for constrained optimization; 8. Nonsmooth functions and subgradients; 9. Nonsmooth optimization methods; 10. Duality and algorithms; 11. Differentiation and adjoints.

Optimization for Data Analysis

£36.09

RRP £37.99 – you save £1.90 (5%)

Hardback, by Stephen J. Wright and Benjamin Recht

    Publisher: Cambridge University Press
    Publication Date: 21/04/2022
    ISBN13: 9781316518984
    ISBN10: 1316518981
