Introduction to Multi-Armed Bandits by Aleksandrs Slivkins

Summary

Provides a textbook-like treatment of multi-armed bandits. The work on multi-armed bandits can be partitioned into a dozen or so directions. Each chapter tackles one line of work, providing a self-contained introduction and pointers for further reading.

Introduction to Multi-Armed Bandits Summary

Multi-armed bandits is a rich, multi-disciplinary area that has been studied since 1933, with a surge of activity in the past 10-15 years. This is the first monograph to provide a textbook-like treatment of the subject.

The work on multi-armed bandits can be partitioned into a dozen or so directions. Each chapter tackles one line of work, providing a self-contained introduction and pointers for further reading. Introduction to Multi-Armed Bandits concentrates on fundamental ideas and elementary, teachable proofs over the strongest possible results. It emphasizes accessibility of the material: while exposure to machine learning and probability/statistics would certainly help, a standard undergraduate course on algorithms should suffice as background. The first four chapters are devoted to IID rewards, and the next three chapters cover adversarial rewards. Contextual bandits are discussed in a separate chapter before the monograph concludes with connections to economics. Each chapter contains a section on bibliographic notes and further directions, and many of the chapters conclude with exercises.
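As a taste of the stochastic (IID rewards) setting treated in the opening chapters, here is a minimal sketch of the classic UCB1 algorithm on Bernoulli arms. The arm means, horizon, and function names below are illustrative choices for this listing, not material drawn from the book.

```python
import math
import random


def ucb1(arm_means, horizon, seed=0):
    """Run UCB1 on Bernoulli arms whose means are unknown to the learner."""
    rng = random.Random(seed)
    k = len(arm_means)
    counts = [0] * k      # number of pulls per arm
    sums = [0.0] * k      # total reward collected per arm
    total_reward = 0.0

    for t in range(1, horizon + 1):
        if t <= k:
            arm = t - 1   # pull each arm once to initialize the estimates
        else:
            # choose the arm maximizing (empirical mean + confidence radius)
            arm = max(
                range(k),
                key=lambda a: sums[a] / counts[a]
                + math.sqrt(2 * math.log(t) / counts[a]),
            )
        reward = 1.0 if rng.random() < arm_means[arm] else 0.0
        counts[arm] += 1
        sums[arm] += reward
        total_reward += reward

    # realized regret relative to always playing the best arm
    regret = max(arm_means) * horizon - total_reward
    return total_reward, regret


if __name__ == "__main__":
    # Illustrative instance: three Bernoulli arms, 10,000 rounds.
    reward, regret = ucb1([0.3, 0.5, 0.7], horizon=10_000)
    print(f"total reward: {reward:.0f}, realized regret: {regret:.0f}")
```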

Introduction to Multi-Armed Bandits gives students an accessible treatment of a topic that has gained importance over the last decade, and lecturers can use it as the text for an introductory course on the subject.

Table of Contents

  • Preface
  • Introduction: Scope and Motivation
  • 1. Stochastic Bandits
  • 2. Lower Bounds
  • 3. Bayesian Bandits and Thompson Sampling
  • 4. Lipschitz Bandits
  • 5. Full Feedback and Adversarial Costs
  • 6. Adversarial Bandits
  • 7. Linear Costs and Semi-bandits
  • 8. Contextual Bandits
  • 9. Bandits and Games
  • 10. Bandits with Knapsacks
  • 11. Bandits and Incentives
  • Appendices
  • Acknowledgements
  • References

Additional information

  • SKU: NLS9781680836202
  • ISBN-13: 9781680836202
  • ISBN-10: 168083620X
  • Title: Introduction to Multi-Armed Bandits by Aleksandrs Slivkins
  • Condition: New
  • Binding: Paperback
  • Publisher: now publishers Inc
  • Publication date: 2019-11-07
  • Number of pages: 306
