Description

Book Synopsis
The primary goal of this book is to provide a self-contained, comprehensive study of the main first-order methods that are frequently used in solving large-scale problems. First-order methods exploit information on values and gradients/subgradients (but not Hessians) of the functions composing the model under consideration. With the increase in the number of applications that can be modeled as large or even huge-scale optimization problems, there has been a revived interest in using simple methods that require low iteration cost as well as low memory storage.
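As an illustration of the kind of scheme this describes, here is a minimal sketch (not taken from the book) of the projected subgradient method in Python; the objective, constraint set, and target point c below are invented purely for the example.

import numpy as np

def project_ball(x, radius=1.0):
    # Euclidean projection onto the ball of the given radius.
    norm = np.linalg.norm(x)
    return x if norm <= radius else (radius / norm) * x

def projected_subgradient(subgrad, x0, steps=1000, radius=1.0):
    # Only values/subgradients of the objective are used; no Hessians.
    x = x0
    for k in range(1, steps + 1):
        g = subgrad(x)                                # first-order information only
        x = project_ball(x - g / np.sqrt(k), radius)  # diminishing step, then project
    return x

# Hypothetical instance: minimize the nonsmooth f(x) = ||x - c||_1 over the unit ball.
c = np.array([2.0, -0.5, 0.3])
x_approx = projected_subgradient(lambda x: np.sign(x - c), np.zeros(3))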

The author has gathered, reorganized, and synthesized (in a unified manner) many results that are currently scattered throughout the literature, many of which typically cannot be found in optimization books.

First-Order Methods in Optimization offers a comprehensive study of first-order methods together with their theoretical foundations; provides plentiful examples and illustrations; emphasizes rates of convergence and complexity analysis of the main first-order methods used to solve large-scale problems; and covers both variable and functional decomposition methods.
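For a flavour of one of the first-order methods covered, here is a hedged, illustrative sketch of the proximal gradient method applied to an l1-regularized least-squares problem; the data, step size, and parameters below are invented for the example and are not from the book.

import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def proximal_gradient(A, b, lam, steps=500):
    # Constant step size 1/L, where L = ||A||_2^2 is a Lipschitz constant of the gradient.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - b)                   # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)  # prox step on the nonsmooth part
    return x

# Tiny synthetic instance, invented for illustration only.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50); x_true[0] = 3.0
b = A @ x_true + 0.01 * rng.standard_normal(20)
x_hat = proximal_gradient(A, b, lam=0.1)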

Table of Contents
  • Preface;
  • Chapter 1: Vector Spaces;
  • Chapter 2: Extended Real-Valued Functions;
  • Chapter 3: Subgradients;
  • Chapter 4: Conjugate Functions;
  • Chapter 5: Smoothness and Strong Convexity;
  • Chapter 6: The Proximal Operator;
  • Chapter 7: Spectral Functions;
  • Chapter 8: Primal and Dual Projected Subgradient Methods;
  • Chapter 9: Mirror Descent;
  • Chapter 10: The Proximal Gradient Method;
  • Chapter 11: The Block Proximal Gradient Method;
  • Chapter 12: Dual-Based Proximal Gradient Methods;
  • Chapter 13: The Generalized Conditional Gradient Method;
  • Chapter 14: Alternating Minimization;
  • Chapter 15: ADMM;
  • Appendix A: Strong Duality and Optimality Conditions;
  • Appendix B: Tables;
  • Appendix C: Symbols and Notation;
  • Appendix D: Bibliographic Notes;
  • Bibliography;
  • Index.

First-Order Methods in Optimization

£86.70

Includes FREE delivery

RRP £102.00 – you save £15.30 (15%)

Paperback / softback by Amir Beck

Out of stock



    Publisher: Society for Industrial & Applied Mathematics, U.S.
    Publication Date: 30/11/2017
    ISBN13: 9781611974980
    ISBN10: 1611974984
