Scaling Multi-Armed Bandit Algorithms

Published in ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD), 2019

This paper addresses scalability challenges in multi-armed bandit algorithms. We develop methods for handling large-scale bandit problems efficiently, providing algorithms that scale to thousands or millions of arms while maintaining theoretical guarantees.
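For context, the snippet below is a minimal, generic Thompson Sampling sketch for Bernoulli bandits, vectorized with NumPy so each round stays cheap even with many arms. It is an illustrative baseline only, not the algorithm proposed in the paper; the arm count, horizon, and `true_means` are made-up simulation inputs.

```python
# Generic Thompson Sampling for Bernoulli bandits (illustrative sketch only).
import numpy as np

rng = np.random.default_rng(0)

n_arms = 10_000                          # hypothetical large arm count
true_means = rng.uniform(0, 1, n_arms)   # unknown Bernoulli parameters (simulated)
horizon = 5_000

# Beta(1, 1) prior for every arm.
successes = np.ones(n_arms)
failures = np.ones(n_arms)

regret = 0.0
best_mean = true_means.max()

for t in range(horizon):
    # Sample once from each arm's posterior and play the arm with the largest sample.
    samples = rng.beta(successes, failures)
    arm = int(np.argmax(samples))

    reward = rng.random() < true_means[arm]   # Bernoulli reward
    successes[arm] += reward
    failures[arm] += 1 - reward

    regret += best_mean - true_means[arm]

print(f"Cumulative regret after {horizon} rounds: {regret:.1f}")
```

The per-round cost is dominated by sampling and the argmax over all arms, which is why naive implementations become expensive as the number of arms grows.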

Download paper here

Recommended citation: Fouché, E., Komiyama, J., & Böhm, K. (2019). "Scaling Multi-Armed Bandit Algorithms." In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD 2019).