Bandit Algorithms for Website Optimization: Developing, Deploying, and Debugging

"O'Reilly Media, Inc.", Dec 10, 2012 - Computers - 88 pages

When looking for ways to improve your website, how do you decide which changes to make? And which changes to keep? This concise book shows you how to use Multiarmed Bandit algorithms to measure the real-world value of any modifications you make to your site. Author John Myles White shows you how this powerful class of algorithms can help you boost website traffic, convert visitors to customers, and increase many other measures of success.

This is the first developer-focused book on bandit algorithms, which were previously described only in research papers. You’ll quickly learn the benefits of several simple algorithms—including the epsilon-Greedy, Softmax, and Upper Confidence Bound (UCB) algorithms—by working through code examples written in Python, which you can easily adapt for deployment on your own website.
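The core epsilon-Greedy idea the book builds on can be sketched in a few lines of Python. The snippet below is not taken from the book; it is a minimal illustration under assumed names (the function select_arm and the example reward estimates are invented here): with probability epsilon the algorithm explores a random arm, and otherwise it exploits the arm with the highest estimated reward.

    import random

    def select_arm(epsilon, estimated_values):
        """Pick an arm index: explore a random arm with probability epsilon,
        otherwise exploit the arm with the highest estimated value."""
        if random.random() < epsilon:
            return random.randrange(len(estimated_values))
        return max(range(len(estimated_values)), key=lambda i: estimated_values[i])

    # Example: three page variants with click-through estimates of
    # 0.10, 0.25, and 0.05; with epsilon = 0.1 the second arm is chosen
    # roughly 90% of the time.
    print(select_arm(0.1, [0.10, 0.25, 0.05]))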

  • Learn the basics of A/B testing—and recognize when it’s better to use bandit algorithms
  • Develop a unit testing framework for debugging bandit algorithms
  • Get additional code examples written in Julia, Ruby, and JavaScript with supplemental online materials
 


Contents

Chapter 1. Exploration and Exploitation (page 1)
Chapter 2. Why Use Multiarmed Bandit Algorithms? (page 7)
Chapter 3. The epsilon-Greedy Algorithm (page 11)
Chapter 4. Debugging Bandit Algorithms (page 21)
Chapter 5. The Softmax Algorithm (page 33)
Chapter 6. UCB - The Upper Confidence Bound Algorithm (page 47)
Chapter 7. Complexity and Complications (page 59)
Chapter 8. Conclusion (page 69)
About the Author (page 75)
Copyright


About the author (2012)

John Myles White is a PhD candidate in Psychology at Princeton. He studies pattern recognition, decision-making, and economic behavior using behavioral methods and fMRI. He is particularly interested in anomalies of value assessment.
