How to win Vegas-style: To exploit or to explore is the question in consumer behavior modeling in digital advertising and marketing

Event
MinneBar 8 (6 April 2013)
Panel
This is not a panel.
Summary
None
URL
Topic
Schedule
2013-04-06 14:50


Room: Learn


Presenters


Video

In probability theory, the multi-armed bandit problem (MAB; sometimes called the K- or N-armed bandit problem) is the problem a gambler faces at a row of slot machines, sometimes known as "one-armed bandits", when deciding which machines to play, how many times to play each machine, and in which order to play them.[3] When played, each machine provides a random reward drawn from a distribution specific to that machine. The gambler's objective is to maximize the sum of rewards earned through a sequence of lever pulls. MAB problems have been studied extensively in machine learning, operations research, and economics, and in practice multi-armed bandits have been applied in many areas. This presentation will review their application in digital advertising while also introducing the key concepts behind MABs.
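The explore/exploit trade-off described above can be illustrated with an epsilon-greedy strategy, one of the simplest bandit policies: with a small probability the gambler tries a random machine (explore), and otherwise pulls the machine with the best estimated payout so far (exploit). The sketch below is illustrative only and is not taken from the talk; the arm payout probabilities, the epsilon value, and the function name are all made-up assumptions.

```python
import random

def epsilon_greedy_bandit(true_means, pulls=10000, epsilon=0.1, seed=42):
    """Simulate an epsilon-greedy gambler on K Bernoulli slot machines.

    true_means: per-arm probability of a unit reward (hypothetical values).
    With probability epsilon we explore (pull a random arm); otherwise we
    exploit the arm with the highest estimated mean reward so far.
    """
    rng = random.Random(seed)
    k = len(true_means)
    counts = [0] * k          # number of pulls per arm
    estimates = [0.0] * k     # running sample-mean reward per arm
    total_reward = 0
    for _ in range(pulls):
        if rng.random() < epsilon:
            arm = rng.randrange(k)                            # explore
        else:
            arm = max(range(k), key=lambda a: estimates[a])   # exploit
        reward = 1 if rng.random() < true_means[arm] else 0   # play the machine
        counts[arm] += 1
        # incremental update of the sample mean for the pulled arm
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
        total_reward += reward
    return total_reward, counts, estimates
```

Run on three hypothetical machines, e.g. `epsilon_greedy_bandit([0.2, 0.5, 0.8])`, the policy concentrates most pulls on the best arm while the forced exploration keeps its estimates of the other arms honest; more refined policies (UCB, Thompson sampling) choose when to explore adaptively instead of at a fixed rate.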
