"Adversarial multi-armed bandit approach to two-person zero-sum Markov games."

Hyeong Soo Chang, Michael C. Fu, Steven I. Marcus (2007)


DOI: 10.1109/CDC.2007.4434044

access: closed

type: Conference or Workshop Paper

metadata version: 2019-11-20
