AI Paper Recommendations

[AI Paper Review - 08] On Local Optimizers of Acquisition Functions in Bayesian Optimization


(Jungtaek Kim and Seungjin Choi, ECML-PKDD-2020)


Bayesian optimization is a sample-efficient method for finding a global optimum of an expensive-to-evaluate black-box function.


A global solution is found by accumulating pairs of query points and their function values while repeating two procedures (sketched in code below):

(i) modeling a surrogate function;

(ii) maximizing an acquisition function to determine where to query next.
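
As a concrete illustration, here is a minimal sketch of this loop in Python. It assumes a hypothetical 1-D black-box objective, a Gaussian process surrogate from scikit-learn, and the expected improvement (EI) acquisition maximized by simple grid search; it is an illustrative sketch, not the paper's implementation.

    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    def black_box(x):
        # Hypothetical expensive objective standing in for the true function.
        return -np.sin(3.0 * x) - x ** 2 + 0.7 * x

    def expected_improvement(X_cand, gp, y_best):
        # EI for maximization: E[max(f(x) - y_best, 0)] under the GP posterior.
        mu, sigma = gp.predict(X_cand, return_std=True)
        sigma = np.maximum(sigma, 1e-9)  # guard against zero predictive std
        z = (mu - y_best) / sigma
        return (mu - y_best) * norm.cdf(z) + sigma * norm.pdf(z)

    rng = np.random.default_rng(0)
    bounds = (-2.0, 2.0)
    X = rng.uniform(*bounds, size=(3, 1))  # small initial design
    y = black_box(X).ravel()

    for _ in range(10):
        # (i) model a surrogate function
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        gp.fit(X, y)
        # (ii) maximize the acquisition function to pick the next query
        X_cand = np.linspace(*bounds, 1000).reshape(-1, 1)
        x_next = X_cand[np.argmax(expected_improvement(X_cand, gp, y.max()))]
        # evaluate and accumulate the (query point, function value) pair
        X = np.vstack([X, x_next.reshape(1, -1)])
        y = np.append(y, black_box(x_next))

    print("best point:", X[np.argmax(y)].item(), "best value:", y.max())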


Convergence guarantees are valid only when the global optimizer of the acquisition function is found at each round and selected as the next query point. In practice, however, local optimizers of an acquisition function are used as well, since searching for the global optimizer is often a non-trivial or time-consuming task.


This paper considers three popular acquisition functions, the probability of improvement (PI), expected improvement (EI), and GP upper confidence bound (GP-UCB), with a Gaussian process surrogate model, and analyzes the behavior of local optimizers of those acquisition functions in terms of the instantaneous regret they incur relative to the global optimizers. It shows, theoretically and empirically, that a local optimization method started from multiple different initial conditions provides performance comparable to global optimization; a sketch of such a multi-start scheme follows.
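
To make the multi-start scheme concrete, the sketch below runs a gradient-based local optimizer (SciPy's L-BFGS-B) of the acquisition from several random initial points and keeps the best local optimizer found. It reuses expected_improvement, gp, y, and bounds from the sketch above, and could replace that sketch's grid search; it is an assumed illustration, not the authors' code.

    import numpy as np
    from scipy.optimize import minimize

    def maximize_acquisition_multistart(gp, y_best, bounds, n_starts=10, seed=1):
        # Each L-BFGS-B run converges to some local optimizer of the
        # acquisition; keeping the best across several random starts is the
        # kind of local scheme the paper compares against global optimization.
        rng = np.random.default_rng(seed)
        neg_acq = lambda x: -expected_improvement(x.reshape(1, -1), gp, y_best)[0]
        best_x, best_val = None, np.inf
        for x0 in rng.uniform(bounds[0], bounds[1], size=(n_starts, 1)):
            res = minimize(neg_acq, x0, bounds=[bounds], method="L-BFGS-B")
            if res.fun < best_val:
                best_x, best_val = res.x, res.fun
        return best_x

    # Usage inside the loop above, replacing the grid search:
    # x_next = maximize_acquisition_multistart(gp, y.max(), bounds)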


Paper link ↓

https://arxiv.org/pdf/1901.08350.pdf


Representative publications by Seungjin Choi

1. Juho Lee, Yoonho Lee, Jungtaek Kim, Adam R. Kosiorek, Seungjin Choi, and Yee Whye Teh (2019),
"Set transformer: A framework for attention-based permutation-invariant neural networks,"
in Proceedings of the Thirty-Sixth International Conference on Machine Learning (ICML-2019),
Long Beach, California, USA, June 9-15, 2019.
(earlier version in preprint arXiv:1810.00825)

2. Juho Lee, Lancelot James, Seungjin Choi, and François Caron (2019),
"A Bayesian model for sparse graphs with flexible degree distribution and overlapping community structure,"
in Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS-2019),
Naha, Okinawa, Japan, April 16-18, 2019. (oral)
(earlier version in preprint arXiv:1810.01778)

3. Yoonho Lee and Seungjin Choi (2018),
"Gradient-based meta-learning with adaptive layerwise metric and subspace,"
in Proceedings of the Thirty-Fifth International Conference on Machine Learning (ICML-2018),
Stockholm, Sweden, July 10-15, 2018.
(earlier version in preprint arXiv:1801.05558)

4. Saehoon Kim, Jungtaek Kim, and Seungjin Choi (2018),
"On the optimal bit complexity of circulant binary embedding,"
in Proceedings of the Thirty-Second AAAI Conference on Artificial Intelligence (AAAI-2018),
New Orleans, Louisiana, USA, February 2-7, 2018.
5. Juho Lee, Creighton Heaukulani, Zoubin Ghahramani, Lancelot James, and Seungjin Choi (2017),
"Bayesian inference on random simple graphs with power law degree distributions,"
in Proceedings of the International Conference on Machine Learning (ICML-2017),
Sydney, Australia, August 6-11, 2017.
(earlier version in preprint arXiv:1702.08239)

6. Saehoon Kim and Seungjin Choi (2017),
"Binary embedding with additive homogeneous kernels,"
in Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence (AAAI-2017),
San Francisco, California, USA, February 4-9, 2017.

7. Juho Lee, Lancelot F. James, and Seungjin Choi (2016),
"Finite-dimensional BFRY priors and variational Bayesian inference for power law models,"
in Advances in Neural Information Processing Systems 29 (NIPS-2016),
Barcelona, Spain, December 5-10, 2016.

8. Suwon Suh and Seungjin Choi (2016),
"Gaussian copula variational autoencoders for mixed data,"
Preprint arXiv:1604.04960, 2016.

9. Yong-Deok Kim, Taewoong Jang, Bohyung Han, and Seungjin Choi (2016),
"Learning to select pre-trained deep representations with Bayesian evidence framework,"
in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR-2016),
Las Vegas, Nevada, USA, June 27-30, 2016. (oral)

10. Juho Lee and Seungjin Choi (2015),
"Tree-guided MCMC inference for normalized random measure mixture models,"
in Advances in Neural Information Processing Systems 28 (NIPS-2015),
Montreal, Canada, December 7-12, 2015.
