International Journal for Uncertainty Quantification

Published 6 issues per year

ISSN Print: 2152-5080

ISSN Online: 2152-5099

The Impact Factor measures the average number of citations received in a particular year by papers published in the journal during the two preceding years. 2017 Journal Citation Reports (Clarivate Analytics, 2018) IF: 1.7

To calculate the five-year Impact Factor, citations received in 2017 to papers published in the previous five years are counted and divided by the number of source items published in those five years. 2017 Journal Citation Reports (Clarivate Analytics, 2018) 5-Year IF: 1.9

The Immediacy Index is the average number of times an article is cited in the year it is published, indicating how quickly articles in a journal are cited. Immediacy Index: 0.5

The Eigenfactor score, developed by Jevin West and Carl Bergstrom at the University of Washington, rates the total importance of a scientific journal. Journals are rated according to the number of incoming citations, with citations from highly ranked journals weighted to contribute more to the Eigenfactor than those from poorly ranked journals. Eigenfactor: 0.0007

The Journal Citation Indicator (JCI) is a single measurement of the field-normalized citation impact of journals in the Web of Science Core Collection across disciplines; the key point is that the metric is both normalized and cross-disciplinary. JCI: 0.5

SJR: 0.584 SNIP: 0.676 CiteScore™: 3 H-Index: 25
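As an illustration of the calculation described above, the two-year Impact Factor reduces to a simple ratio (a generic restatement of the standard Clarivate definition, not a formula given on this page):

    \[
    \mathrm{IF}_{2017} \;=\; \frac{\text{citations received in 2017 to items published in 2015--2016}}{\text{number of citable items published in 2015--2016}} \;=\; 1.7
    \]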


AdaAnn: Adaptive Annealing Scheduler for Probability Density Approximation

Volume 13, Issue 3, 2023, pp. 39-68
DOI: 10.1615/Int.J.UncertaintyQuantification.2022043110

Abstract

Approximating probability distributions can be a challenging task, particularly when they are supported over regions of high geometrical complexity or exhibit multiple modes. Annealing can facilitate this task and is often combined with constant, a priori selected increments in inverse temperature. However, constant increments limit computational efficiency: they cannot adapt to situations where smooth changes in the annealed density could be handled equally well with larger increments. We introduce AdaAnn, an adaptive annealing scheduler that automatically adjusts the temperature increments based on the expected change in the Kullback–Leibler divergence between two distributions at sufficiently close annealing temperatures. AdaAnn is easy to implement and can be integrated into existing sampling approaches, such as normalizing flows for variational inference and Markov chain Monte Carlo. We demonstrate the computational efficiency of the AdaAnn scheduler for variational inference with normalizing flows on a number of examples, including posterior estimation of parameters for dynamical systems and probability density approximation in multimodal and high-dimensional settings.
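As a rough illustration of the idea in the abstract, the sketch below implements one plausible version of such a scheduler in Python: each inverse-temperature increment is chosen so that a second-order estimate of the Kullback–Leibler divergence between consecutive tempered densities p_t(x) ∝ p(x)^t stays near a fixed tolerance. The tolerance tol, the variance-based step rule, and the log_p/sample_pt interfaces are illustrative assumptions, not the authors' exact formulation.

    import numpy as np

    def adaptive_increment(log_p_vals, tol):
        # For the tempered family p_t(x) ∝ p(x)**t, a second-order expansion
        # gives KL(p_t || p_{t+eps}) ≈ 0.5 * eps**2 * Var_{p_t}[log p(x)],
        # so holding the KL change near `tol` suggests the (assumed) rule
        #     eps = sqrt(2 * tol / Var_{p_t}[log p(x)]).
        var = np.var(log_p_vals)
        return np.sqrt(2.0 * tol / (var + 1e-12))

    def anneal_schedule(log_p, sample_pt, t0=0.05, tol=0.01, n_mc=1000):
        # log_p: unnormalized target log-density log p(x), vectorized over samples.
        # sample_pt: draws approximate samples from p_t, e.g., from a normalizing
        #            flow or MCMC kernel adapted at temperature t (hypothetical API).
        t, schedule = t0, [t0]
        while t < 1.0:
            xs = sample_pt(t, n_mc)                   # samples from p_t
            eps = adaptive_increment(log_p(xs), tol)  # KL-matched increment
            t = min(1.0, t + eps)
            schedule.append(t)
        return schedule

Under this rule, where log p(x) varies strongly under p_t (e.g., early annealing stages or sharply multimodal targets) the variance is large and the scheduler takes small steps, while smoothly changing annealed densities permit larger steps; this is exactly the adaptivity that constant increments cannot provide.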

