Description

Book Synopsis
The theory of adaptive control is concerned with the construction of strategies that make the controlled system behave in a desirable way without assuming complete knowledge of the system. The models considered in this comprehensive book are of Markovian type. Both the partial observation and partial information cases are analyzed. While the book focuses on discrete-time models, continuous-time models are considered in the final chapter. The book provides a novel perspective by summarizing results on adaptive control obtained in the Soviet Union, which are not well known in the West. Comments on the interplay between Russian and Western methods are also included.

Trade Review
"This book is addressed both to students with a good mathematical background and to researchers and specialists in adaptive control who may find the book inspirational." Mathematical Reviews

Table of Contents
1. Basic Notions and Definitions
2. Real-Valued HPIV with Finite Number of Controls: Automaton Approach
3. Stochastic Approximation
4. Minimax Adaptive Control
5. Controlled Finite Homogeneous Markov Chains
6. Control of Partially Observable Markov Chains and Regenerative Processes
7. Control of Markov Processes with Discrete Time and Semi-Markov Processes
8. Control of Stationary Processes
9. Finite-Converging Procedures for Control Problems with Inequalities
10. Control of Linear Difference Equations
11. Control of Ordinary Differential Equations
12. Control of Stochastic Differential Equations

Mathematical Theory of Adaptive Control


£143.10

Includes FREE delivery

RRP £159.00 – you save £15.90 (10%)


Hardback by I A Sinitzin, Vladimir G Sragovich, Jan Spalinski

Out of stock


    Publisher: World Scientific Publishing Co Pte Ltd
    Publication Date: 29/12/2005
    ISBN13: 9789812563712
    ISBN10: 9812563717

