Multi-armed bandit experiments in the online service economy

Abstract

The modern service economy is substantively different from the agricultural and manufacturing economies that preceded it. In particular, the cost of experimenting is dominated by opportunity cost rather than by the cost of obtaining experimental units. These different economics require a new class of experiments, in which stochastic models play an important role. This article briefly summarizes multi-armed bandit experiments, where the experimental design is modified as data accrue, in order to make the experiment as inexpensive as possible.
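
To make the idea concrete, here is a minimal illustrative sketch of one common heuristic for adaptive bandit designs, Thompson sampling on Bernoulli arms (the abstract does not name a specific algorithm; the arm rates, round count, and function name below are assumptions for illustration). Each arm keeps a Beta posterior over its success rate; at every round a rate is drawn from each posterior and the arm with the largest draw is played, so traffic shifts toward better arms as evidence accumulates, which is what keeps the experiment's opportunity cost low.

```python
import random

def thompson_sampling(true_rates, n_rounds=5000, seed=0):
    """Simulate a Bernoulli multi-armed bandit run with Thompson sampling.

    Each arm i keeps a Beta(successes_i + 1, failures_i + 1) posterior.
    Every round, sample one rate per arm from its posterior and play the
    arm with the largest sampled rate, then update that arm's counts.
    """
    rng = random.Random(seed)
    k = len(true_rates)
    successes = [0] * k
    failures = [0] * k
    for _ in range(n_rounds):
        # Draw a plausible success rate for each arm from its posterior.
        draws = [rng.betavariate(successes[i] + 1, failures[i] + 1)
                 for i in range(k)]
        arm = max(range(k), key=lambda i: draws[i])
        # Simulate the arm's (unknown-to-the-experimenter) true rate.
        if rng.random() < true_rates[arm]:
            successes[arm] += 1
        else:
            failures[arm] += 1
    return successes, failures

if __name__ == "__main__":
    # Hypothetical true conversion rates; the experimenter never sees these.
    s, f = thompson_sampling([0.1, 0.2, 0.3])
    pulls = [s[i] + f[i] for i in range(3)]
    print(pulls)
```

After a few thousand rounds, the pull counts concentrate on the best arm, illustrating how the design "modifies itself" as the experiment progresses instead of splitting traffic evenly as a classical experiment would.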